[Binary artifact: POSIX (ustar) tar archive of Zuul CI job output. Recoverable entries from the tar headers: directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/` (mode 0755, owner core:core), and the file `var/home/core/zuul-output/logs/kubelet.log.gz` (mode 0644, owner core:core), a gzip-compressed kubelet log. The compressed log body is binary and not recoverable as text.]
UTJ yqo~EF_U*o:پF6*vˈ  HRaI#yj>REyۊc@~{B/u'X{P89߲jW5eS480,nh6YlφKH+^ǠΟx>XKyd}5h2r;5E))c!&1κF^j{9~U(:0:*ZKiw<*9ךļA[QyREPjtg#q̖0;&i+q/m"7MhnyxX8]wKܪfQr|_$CXP""@4BP ǧ($17KdLn`1ܱG6UMKL?U_F1͋vln{ :i77%hj5Qn)r^Ļ/v^ڹٽB{>6eaR6d֖XNJ2-Lgɷo+jn:"|=k)U C>~Hu `ʭd/9 93M\^Mrs[Ț((A|c]~t'SS>at DK2sN51Kl66J+6 ks5c9eèmė&A TbγR^ÝNUn[.wIqxo[m0V[UBrsьG3&v]fyp~C24J-K})][7'['iooK۽=y~Ҳ'a~kٶ?YϾgeV3Fe.PtAM|ԭV<ҸHW _dwamI5g:m@:_DZ,T =\zn XY⻾ajP}lG nPU;Cy sFTMgh0ԝr>"X8Jsم~R^EEt&!LtΑCaY^ppa˝Shl^N' ߒ t>m7ț@'v)dA7;X$; .U֤zN|gاupρ}F^32pj5R!IypJ @@\m}T~\kIAJ>~ yM"0)oN48kmXM#[ew@^og_5Dw;/R[WpzG_~i.7B9J|mtGi9NayNN:9hh{02ӵ& kIXM]k%}Ni?/XTNoݽ]7B$(<ΕQI 'RLSD*mp uCkD[itqY ?gd_׹ ޤ35d@zS}HA\$"(U R:,PB<&'aKP䰈xw8|{]fnu`.Ȕ2rE= eڨO9s"t ]X1 (빔;Y!!q#rWqǃMڍ?a7A2( `>[:%շB~E*x 0*%!q CK B#uU>۰"Y߅SA]BN?1;VEHI ">DH"B`\DŽܓ x=Es"aEQjA{c'm줜yǵ9B IF՛9]iX 0P{2(vx i =f"C,#$XWSl$'sFv&[M)] G@l"v ^[O]Rj52۴@ )P?U:j%#лK."L\5*wmn<P0fk| #/JHB FWc)@։%땉 q+Eug^w7A'0-:N~+/0AC@Y`g Su^-gw}lѳFYi6BFǠU ; cA&ī /"=Bۇ*s"eNq"Nj<mTRWљ@舎!3cJ! S.DȥA;'\8IG' yw*Bխ1UPMSkF4I:G3Ȱr޵ wI.'zĖTFٱ_kٜ6A=-}6;7F~?Al 9T 萴9i00>y 6KS9vOUP^b%?|[.e'r<:?voBѱX(}wF dHY=Nx?-d+c ndbd҇D33 8~t%xǦ~fj#܅#I^vq>4> 58b2Im:(bXH Z]LfeEd~< yF5(J(5}+Ϸ) .X`Ea x{RZaWۘFRCVOIoATTK˙,ʺ<6Bi7(le½g=t"m[5juQs9=6EJP]*?E5E}`|A󢚻My>3l/(CQidB.XW=~4Ρw]X|NK6 IQ㋿=kW1ݵޙP;=[Zyd |d=e5GvSxsmц۶)(a۶G>\j5$x{rπXق |йi5xIe.B޶]l,O7LSdqp$:A ˽jPKB:pNzqcG'݂GÇU":.PƄp©Iē\x=qhM:URa}:V < [gC!]u՞aNﴡQ%Z{q*%=FިlN=*yRBh}G1/})L{NĈv!Ă2 d~"Hj9VHt2W߰ . O+}y7׃pzT`nM{(9(WQy[Ӧ557G5G>=OՑw/C zLйe+MǍ|zx̝؁]10 C ;`@L.;]f32RߞQvқd/b~Bn&9m햋5,dogW՘vX ?oHN;kdXb#VG#$KS%JQ*d-ylॵ¬x:,q3ž( 6Fo7%!,v:'2]w"#8x ?x9E rd9 N^EhZV;cؗ{]GqM Y>>>+)0"<*% P0͌Na,X`ApepLIZIr|tx[ K5Xoa(a09QQQQP[i. 
Jaj(ޢ-WW {9@E@RQD5Ncp0u# *k53۠֟'nf7ALA.k'煠7dxڼʖ;L8/ђ[y:Q^;֪vԵ&}ͬ1lȍKl28ڒ7gw Sai40hT<=lheV6f9!D0ԺO^a慑!|z̻xv+~_+薉kba| +Ox4oZ$%Pst/ע?`^zyl2` a.GF/A~ lg$W&as[#!7oS4,XKHOn,HLkS[]d"&$~Θ ,xDBi3سf;c5p89}-mh *Bc-PGc`sKp'Z^vD\ KP$F:"^Q$wAo!ZꉏI+ӚDpJaV5v}VwJ:NU !9/Vj#x+Dd 6*I$ ;עzڑb2gsR箺cׅ@o~5pp_T0fh9h L*/<(jJ6rDŽ]Ȁ}ET;- ʔpE"F] L IFe;)Xcnj&KF1I*‚4eFryͣ(g)W*}8Z=s!99LaI}WZOI`Xjz_y|o?ߦ4c8|k˃/쭾K3O?|j=j3Vi5΀˕I`e[ =2T^K˙"rі H^]>m[)]"ۛРtfC4_^faR0*`NfWio侊:|O7f45I.IF_#,Ȳoz9J9Y -M/t-`ڰ+ʥ7/8~=_e{\Ù=crbYW/X=S<3S;zx@ˊw^]J-Q*kX2|qU<rE'dֈs,:W"*FhQەk7א7Ir}[58%XXiqe c8*S ;+2=͊ǯAgܸ M[w+҃]Os:]3FFp6u焛Zl2v{}a5o' \r=GͰ fP"p@F p܊{-Q(MOv =0V`Bg}s 35nQo 0=Y\aV?Gܹ]T@.0 %v: ϩ.ai`P5  F2jN+pҧ/lf-$oӫrݺװU4KQSb!"ei%62nu4[mMr`%N(^2 ܖ<Zasn5gg}d)k/6Fo7 ByuYLB&׿.9^45 rd9 N ٽo=+f1Kowrǫȃ3G=D?>>+)0"<*% P0͌Na,X`ApepLI噵)L-6XoaPx^N[h/G.J"J͏rF{qlu8sdMҚe 9,wEK!(|S||W-Qi8f/f~%j}@ߎ50cjw&=$6ொ'$⛟A 5 *km],rpR%AWNmY:dyPuzB5?lȀ$ݭ{8#]=aE3>q-hJ>Z lFMD8NchQ=NQEj&BʘeAZ+Z'#e˞ b$zΙԑ@;T ! | /g*5Vw|슷irR]]X$M;2H\^#)ۓ̼ܳ,׷er0y4Ks\`ԗ'Jq>Q`ĹnC%tR:l Bi*H* 6pCeϹq/위|dƨDɸ%hTKcppH(Kͣ"ګt<),:3e200BDsT&TG fic4hs !2l :[Ύ|V՗WfMũ >/% N(:!8r8kv!b"V= twkA~a&O!w"l (a!'}2(IXjulI4gޗ/HIp$K D3'd4`"Z ,& `@Vc1b5g]jR>^Yt- oGio<$K^5XH= p^kEs[wX$\v5%3y*鎙 `ہ=u$8˓ 0,W*5J老v-8BMgo6=lv-u|-Kܰʭ=D/.5dobmȖ,z\eŔ̀돜I=7g;~bwǙ&vcGP1(, L92bDhP-8={'wQSy:h9}͎Ȍ F:Yb>2l`&YBFg}D[&(NYiJ&dS{vaŧ/%,Zߩr{A4ʻ u uPj%=١ƪ딸ja=M?U#HAЌ? ԃ {Σm9ޣ\R2.tAX"Rzʐ 52^?"mc!Ga̽vpv4LPq &wL2E|uB&Wr>tLYMrZaod-;q ^qD#BSY9,P)j!R$yvw V﬛kdo"&ut=^dyˡ̵@Fc$c9KXVMl̯jQӾ兒 U~JB˺8;CM u+^kp~pn~ekpͯvc;c󎰷5IP>8I HLɤTPn {:oJRR2sC7̦=0cp[H<"GwV8+*}_-zw 2isx/RE]jEUg38]̞} +SB`dh}H/<~=ӌ"f mZ>ꃒBLa࿐Kt&4?}W;=_sJXfTxQ}Y`d{yj $ÝLdw. 
.K%@&Q^[|c=_ $vFEw0OSz'o*GKԗK.񔅄nVjH(pW&%f ,({R3~kK }FwgjUޡ!k_Uryti瑭hiwAQږG+Mh#M縪viBl|Vծ|mN {SJKm̱,|୍(bVzc߳><,v(r7j 89ڍjzX}@3d@ >EU]J:(zO`]=j5@!0֤ dNj U{bBΙI"))5ZءP(R ڟ*8'f~?5=]Ӷu$ !L2`k&Z2h^z&PZ2X8@\pL F+j{DZ$y)WyfCbO qQ\4B<&o$ e|y hC<>]^Qz.6WFWf|謨2'=Y/jkuш4r>_so>"-VyP^ If#wo19XHQ5RLyԄr׍ߕ.ɄWU4*SL4( }-*!|2Ҁ :iZh؋ݜc]ޑY(U& ٬4QE|@RI1- ,-fM:gkZ״iԴְ(E"JWzeD%Ie'hm&h$GɯOy>|DHxC "J--V<=D%"J<Ab_}śr2|~ 7aMǖ;RyEˀ{r۽ֈK^Fimo IO&#_EP˜ A4 Z KEK~ɽo^5Oc_A_k<9BStaDF>_s/^,ZԽB;8yo|7c#n{Gcf0?-;&3p=o`/QalYכ77+8 U))BnOZpk#*c_pm^C3bv}Yw ݣWYRY)7NMǴGkW S\Р^R[;?֏n>ݸ9] xrmb}it3AU+RkVX t.&.=ʞKPiYA@'}}-fnCݾB wthQKlm?S?_]t-)//,WFCN[󦺏4B DOOEQd:c'S#;:lRd1eVuХ 67ldW<5l{/..ʞnN/ Zzgսþy_F :KճB6iRܸ8wfIhC*Pje}eO{#"G-]},jeNPS /E8:ckO}ܷ.}y| ?cTP)#5 H(:R %2;+}@Zmts >_RUJ "Mֆ̜1JbzBP(a-gpT*bV^⻏`^PhkK3GItH+(`!jμ6J t9[Zj焃Lz+Ђym5fCJJ(P@lYTH9,05Z0o^QYjϖju-gb^|^|am Q:y1w<%O+C`8,);`+(4Q>/2bK)MrPh<׬5zd"sDh)X N$ICi(u -ЭgC2V 0 1g!i*WPhJfbʘ::E (T%mi klkebPh¼խ-RBjK!n`m)PhBZ3+9Xhpu5Z05͙XDDEGʺ9 9凘Ђycj|ʨ(\9K K!80+-f]劒={1bU(ِWPh<3/9rR(F<0ZN2g` -hZ31YI4?_C֮Nj1tђx_! 0_AMk'1Y[J8Ϯ!DF$S&PŀЂy4W>' B0C)ńb5Z0tx^3HU*aTU| n[ yPz8 1zCWPhmXa@flF˦E#N| k63$W_ɓ/$$qeM BgVNe?U*:0 @ޮՐ}Y)V5[FEgDqukߨ1DB^6nR$`ǂ8l)t뮡ЂyL6gd$ZT1El䄢[V9&Z0hs =7EI,* -ꒇBS,CWPhEz3 9#F4G]msǑ+(UžxyQ.!K\mE@ ɺT{ aX;)I$3OOw?3; d1>=i`Iݵyds`04J'u,سüI ]\RR:V8/Haҥܷפ.w qN1w>A}MZBxiT>wNRjLP|e-)'=4i ᵲk+gߤ.tؤ2FUЈA,9,.{oB3ú6-l UtmASҲ'kBsi%ӶʪS@Cƅ/mmB }"l}")(<-}lk8}"@':DW؈|^BW AwΐJʌ kOtpɅڈvV@WCWFXr+l͆.t(Ր #]YhV+̨ʆ.Ut(p+P1rrjU+U;Lڡ[+:աҒ'uc k ]!\s+T]#]1E_BdCWW\)Me]]qt%/ 6l ]!ZNW9ҕ$2#B:dlA@kh+DɆdJ.c KO2p5ͅ T:2F3+L9͆Q+k59EWr^5ٔ-Ȁ(k' t;]gD)*j68 jk̦- @{ Oӛ_u}PR+//i!3uL)G!^;bOo^7r< q5<%|y=[,Jm?/L.2OkYXa$B4A9+hEǿ]^[ܪFVuu2`ᮌ5HGDEY@l᥷E$A0) Xl_FzI[o՛7`%tq=.ź~jenuU]onDž M&}jg+ж$xxX,3ue L]p@IhYiIG_X7SqX\(- fˣvYH>u-R9)Ty,84e@f&qLg8/F34-Xo R=ݏKM@ΜyM\kBh~?$Y>L9{qw:wd7>aπoF+7#X-gý VH^|) ]J GL|i4ʚU]Lm9ӡ)&H#t.j+}]ǷJT> q4S*כf"ض=7;Ht-4?ZTZJԊ_VOn薮rͣw5_= rhbsWMQneBeФnM467( [4ň>1f&T3p2˴\1 F?aQ! 
^H"Ġ5.hX5 Ń*`^󷿙~tU?7&@CW_U=zl[zb,x8>RP{$/ME# [n,5v|"g0SwJwT9u@pud]ßiT y!ކ%4K [XRO%o#ij+jc&=jRRQKMvhM=R:T{ΰìa\gDWؚl rr+D+X QΑ ~cv+A!S}+Di@WHW-&IU!G 7~ ayj1rh6Mx7 ea+.\Truj}dau $05ꄮ^I9 ʪ8 7-t~5/DL٪Cz~w/m=BˌA˝h=Wnܨn{e0(Q;6yf=@Mfэ:-Q5pek/IuFާWwZW"KAsjd ԥ!ĽKM|dA3/#*9颱#q޿GTFUt |wiogi/xy=z5wnue;2[ӧ~U ϴH$0HH#1ˠ-giWwwo?RO}BU w pV/n%~Hփ[dSBLRBR,(5RgXRx65Ɉ0gQ@"E?D{ۡ4b3+ )4E?+L.thr Qj]]FyP5gf(<v`X63'lFF݇(i42+,ٌpU6>5+C|7T=$ 0#^iU;Dev(ynN-tu"ElFt5ˆ%]#]I$< +<ąkRΐ$,J4dSiC\e߶] t$ta2uL̟`HWtpE6 =աJPZ:ҕJXl Z ]iW Q֦·Ҍ 3OZ ]ZH Q!:G•<B&Ctute a, [ ]\N͆hEw!Ji:C?'\Es+, GW +P갧`.OjS;ZsR{+o/t%:TփZdCWI7EWV4d3+f9$#B'?R\Ni.th;]!J#:CViSṫֹD%s+I4#l ]!Z4z3+P2#l ljWV4v3+M!<#B& >őOevt$tefDWX| lA@+YKR s+KS,#9md@BWT CR+PPI`*O}Fk;t -'wezAW]V+ӓ_wCw@W tW[bYn#`+gFg}O&(10Xҁhyn^_k~'9v1C?WOeU?+޻#Ϳ:z<|#r,t/D}W|nR,9A2!7hG8DH2^r.wl KƘU;6JoSg.ibnܽ}opV;OYl{fރ)iʛ7__ƖYu󍚽m>lf=#~Dr}~aV_<`lωyΔ{y%>!7 h#=?{>^gWg["Wp[Z\;(_x4qΤ)<:I#Ty+'>Q J8u\Ghyo1k{vތ0_SvKExwpSt4biɔ^nNrJx%uDrAeD(HVSAeyAM$Z) !us"0Ե>9+9<(5ʅCf`c Dq`fTЩ$ X9FXV1WOR@f'`@NJK:%X(3xCmJ} KH2RWzg+ډpkhkavsn !.[*K)C+]JB!2d*HJ&-LTDHi-$sw15}̠ƒ1E!,ePƆ2zA-wB{,1 +XQOCd54B}dW@\u X 1} !MDA+c|Hh*],K`KV 1D0P+"G@3IK8omZN\HL6.P$IˠKZ2 T6 D KJ漹E TKXtRKIH$8Aۤ֐R) Rhy}/K> FY,`;ą3i&~f <;PHۅ*]%`5R4f$r!`!6\&ޯHQC-b'X'(t^ <n޵qdٿBN0ib6afMv isM I9.T!ZVKԊ&@bT{=-Ų '2S ^Hgx UoB@h5"fB4`FYxH!ّC(%G^=(.Xȗj吱N+rLcmiɮYEF ZCUp;#0pZ(Ϥ  *DPf g(y 16b2MC"{:*4ƅU侶3)Ufg19UL. FU`%BvfǠL[$q} \cdP \rC`P*ؒT2j) c[1N-t:K& USbb*2%MBP MGqTP*fT_)2 sYfPPcfgdX.2zY|Rs<+fP57]?0c)(`i핇03uR_7UUN%KUDԭ1 tX,pČ`&s\J)PI1δr*P% ac5.e `I p2X)V%ΨD\ ̑EUWjԑU=%h~EBY۟r@OUDAj}ue"R B^uVMhǰՈ &yYEBb|UT EpPRil-@x@It! /\IZQ3W])4aȠtje[ZdbVGr|b |,5jp0#@ߛ w3l?AfuYvmY UY_wTirB#5uZm@-D&^x1:p00K7` s[ Nz-(] &W$-VUz|񘌒$;`ŗX~aByPr $Pd"e+ 僕EJôL燍Us4. 
Ab)حLmx(nEf06 G7ΎdA X?`yrx*Nm ɜB]V0N+fdsC <ВM~QD[9jP"˥P4]&d*P(yDA0J$5@OO LPvU߸YúIA>,W؊"D W}\N.BN!Fu/ϘV2Zة !ZYPcx)U,JjňOZY3~ aQ0'#j pNiu)"725HkԬV gF0(I<(2SA"pՔqOk#k$Gw \->ڋ7@KvGQkJ6PX?ǰp(Jf:z {!ֿ?nD5>d8֞&3JOQA,6 8c& KJacvNQ}6VŪR盉0K1!;FfRbJ* aPbɱ.hIM@K(uiU~ޮHx;@3 T GVy]\ōz[}Lno;@8Kr+7ϟ~n V.gn:8J"J7Fo^/0%&7bnHb r6nNa6k;)bzyg_i.t`B/7zq6nZJ)ÿ̃4kkVX٪])Oa:OY 8/iC,bU=D;埒Pr/\t:S8+[O,>B'GP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHɄ:(=zJBlP9!P9JEBQÑ! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP u8 ux[OFUu=vNN<:W,@ZԶ- p%]IF?7egLo7?mFQrPm|+Ȣ,_|>Nv6 Fບvl +d~uX7brȸPu(*"mug羴ni19t(oTbNn9][wot]v*1߁1mqwCuVY`nQ2yy0ܭvN.yێSlouzOq(~+âQsbݏI9ʹ1#zN^7- tlr3QQZ=3??|=y՟{&㨛ocQ&nų7|Yd/'/zK-֧{2cz|4|2M)&&0y_|H~_轶K[gE<0Xz[9RAe6qti@Ⱦi֥1OiNe{m{]v/Э7wCl`*F OKwuyW/sl/[͛+C{~]X_*hLSԊ)˃[l&^{~ӁiywۋsMMXj-#W;ҡO{xky Umk%>u#Q;eo8DxL?uZ_nD/%2JNYj%C28 jd\z Nf?˜R D[Rl:f)8(ˆI2LV$*LfJB&WHR۶yq#+}]`i %@$Y,p`O[gYR$3篚dI&%ڦlEͮ_Uc_C*Wprɯ(kFa0+,.bXp+S I17698H ,X :7^vo?~]7o_?b>~ۇ~ H? ,yBi ݊3O7os7w&[Iv`>U3>3 󁫻h]4 =g~hz4,6>*,U8@{du‚k71z!V ŀSb\c#:u>j7gnVXHo{b0:Ev.WwG-؞sUQtS{>Rݝ,lXP.JUN*]EĀhĀOJ:h.wmCqjfK{}Oծb{2cF1 )l j Ҫs*gw`RӿIGfenٚգ^=[)~<{]2W<({!1M|匵: k޺.-) 6 Z>]`(dJQ6l~:1"W_Ӹ;7Y7 GBnb>4U]HKƥ%y`؅&9tA[gfJvfؔpccaY7 0 6~.ߞ~"P4C-#Q6rW+)3: F[Qރp^Xa_3JtBiZL9 WɋF{蔭4O<@dRleS̬vWrGd=]豯vL6R߬[>8߯UenKڲ6Ԛ=zQ{451I(&ģYK6܎5,#3dN$W,HЎa>RVʵ0+ <*'Te`(CgJp31R~hPi¼<٣@eO+xBx`z5L1 ْZ3/sr|)? k7r’E*L<)qSDc/F/bå G ɪY}@VZkv| (Rh TI4_k2ZM41YlV5q%jۺ^շE! Li5*_<ͬhE[G,YJ擔̤`Ngz׶\N^sװ6|anUNpx8> ịUh4BLBQAsglrD`p'j>he_N&йw;-E1ب2x%K+}]Eg0Jܬַ+hw WⴕTz/편Lq*zǜ_嵲2T5 +40u[ *c7sGq`võ f YàD;K>=YqzOP(JLp&SHbX.4> Oсx* Ğ8{'IqNs#/pF4P4)7-u#`nLg';MʺKtF(E Q,H TEĢDW:3(* %B# 9`Ť%B"D)5 I!Hs!Pd4zNG:;O%enK[`Q+IQwӯ7)#{9lNSS1#;jb'm=֬`-6(EJVr4@#~ij< eHy`1q8*kvo{dnv̈́FBYwWSK1e)Y#h'^m?-[HqNI|mȃ5,ZF2)\JiMYICd>kh 2r)x&c 7 uCJgz3Oՙѹ;&E٤J%H!K ”繱fWٗ?l"!rFa^`ha)1H-hX!Yq:)>x"sT!+oojOVb>=}v;/<':idQ d:\aԡV? 
dGl8q; /㺈" en㐷R"wZ%&z x$ )[X1cO!4zl5ͭH_S棑x]Yb%a.[=]{ד5n.=rT]kv0O߶j"}l~7ȹzvWU9T1InL>D Դ^aƦr.Sx2d9JyXܦ@ >snIov8]vN.Rwͽ-ZGLY^&5Ke3e6Pf&LG[u:eDDqo4`/o -Uz!v(7GԾal {}DulܕGrL=)f|B*1_VXC7? Z|lݳc؁g&fY7مх\`N &s1B ~gZڂ- m|,C5;˜=HIzciM@N7`k U cJ#68R$#Z{1lNGtd%'k[ ݞ0PjְT$Z Z.jRT&AKEP2]MsE7ł9 ˭.":F *7nCL pyå = ՞4#,RFXp4 Bw@A&2"N%m \}?KN-Lǂe[G^$|FbHa! _I~3*̫74/|ts3 "L‘YGIOJo.fevg 0[6X +e_0O K;ZS-ʚ W=<7]~wasgHjX~9N`.0 [ZʩR,VJ\Ue+_^g(=\ h-JV_ R)E飶ěU`̙@[CU_$][Tuc-Fůf4| _hd0„T'ZK(I| yI)R:"{f>ȅ4* bjv4H &$Z B`# :Hc֜Lci@ QzHLdu4(&G3eJ˔vHJ;i ;QX$'V`Qr>o>H!&1:.\ث> ,!_0eUX4)\cm.ZBA b /gS1G EέRȰ, Ƹ#aREb/"D4`_wdXĊrq:G 7:*҇&cfl)99tb?yV(tq3 V/?FK ;hWU5J@0C\Yxms}_hߵplٖҕ3Z8@}r[&~Hv2k'hkq{JcK_k^kd_&94ua>Bh0 'PU6y]ʯ̧sR$ޞ~sس f 5̖l ӂmȀl%ŃƉGz6[z#9ɕQO ?]Vnzܒ ۜj+p]pjeq*4ZA\ l*4>ȕWVuǕ&fĕ\@4Mmpr-[\ZWұ\cp~eӻ-*p%qUJU9L.Wrs9*Җ•[WnMτzWB.W1Zw\J*ʔ 6W( [WRAʒ 2p5-]pj[w\Jmp2urAuW:j#\Zө;P%7\ `, bW <mprmmbW֑u ʥm ` mpr6uxq*]f7ZA\ *,;\`S5tjq%ͤ@i곐.{h9 tW\lzѢLqUJeK]S˗j/nE \WU̲]p9sʵ,]pjVw\JW6ZA\ v6tuyq*W+t\h+,>Q]pj-B6mp&csq-7;IaЍ?^= 0TT@\@՗<{@eIFFr-T{S Nq=ʌ4 cέ$XVaRn2lFaru>)[.N0)eE0*Iχ>BY'Y4ogF@Ώ蠮27;kTQNJLдC[Aq @߸fMoys[sP?u(m":^z \ݮ4EC67|!LV䞽P#M#K#ߘZ:ܕ!5,ܓ! LI酁o4H˚mZvݫVC񝷆b<4Oaⱻ/)2>`'f9سr9nF*Q> P0NVPi7d]I )lӶl볒: Wrn`+qžYR.{hY\6ɵZ$ʹb%p\UmzF9uF1mprMK\ZnWҵ\ LL' /}{9ˎ]S벺 U jqe1jڎFuǺBܭ;PXW+pUj+,-mprǽZVj%pչ%tZ3'oW_@fA_$x1b lnGŷ۳ô)y#Щn4sÏG9ָ޶@y25و1-j@Iis8"2 i(}e‚6OLwiٖ9];Srx 6` %8fX ŵ4m{4%8U)zg87/egO9v8M6߸YKIO(}ă0yH^VL^};[o+d٤/U1m>Q^4+OQǷ^4֘\`>@3 s^ rotLy;^kq7?6:Gi>N|A44T{Q!x$KƟ Ah͝|ZH7 F_5 F|o:TDLI+H {ppʏ)C{Qs_v6U1uwwF?>-y0~䟷wCvЊRXTe"1+dg~|'̉#3⊜aw=8ދOa>LlGa^wHghrYs?|ntpE8; ևGL7Fr{{S9t}6 H:i]ӧ d&L])'N1O{RJzޤU&F2c"#1:nspX?#j8hcc7bf]d>$ϟ}޳?c~xj]9-43wUHa.LdຖiܶL/]vU[fqܬSK{y29 ?X<(5 Lhd q6x+]g,፳^iRx𽖟dU%`bP˲Q9]HoA~cmsZC$PH0U Vn)+7Ӵ/u#Ͽ)yY؅N$ _!WD7P^hl)򄥘S/"y<%jQ4Ry'qwtrfYӍSp-"9<+yVJ"!GE|Pd0g7Pl B 9תqs>\1S?971xq:} 7Ia}6? 
!_vIS|O3$qgYT_nKgdMZ>WNQ9tiO]T,Zy$$ޑEk8ڴ{vaUULe)&Yިmrg&ƨ 1?$%,͗jTķz]N(K12Enp?ZBaT af!UA|h?$7"pfA\P*}F.iCg>:zaqbeTQ$ĭUiMV5=euTC ͶdTTM ÏF=8AsrxuvV脙W'G/x[> ~~#7Xe{c mbi7)(&@ej/TF&nyhw%3<b?wڹ|4n Fa]%xIG{>^y&.O6髈+'Y ŗ vЄV~ras¾tnx|ʹUwC5mZ.SUD ôA?,ple{%=жnJr<~b&YI!ݮ LqpY(R ')1FOу)E>ZH(`FOqC(?>zᯝ']xk GG'/v:k$+ w^|N+v_m>Y'O  1k=o۝QH8;X!zy,$fCm#t8 :%#i9!\38Κrz !^OQ]|cJ9ࠫW HT<-NhTɩ,\?4߿=6n5a؁_8]bwMA""y}HU$ڐ턯O'WxH+gb/Lc`jԏs f=#8U(L@##ov6i^pv~; $򊨖?ͧO"?Cgz7O{( Ŀl={sחk& 4!kړ!@sC>~TrIS9w+btJy| vw߽RU\8|YfoŔAN9\1Ty~0<ݴo§Y.Փ Ϊ^^e (~3(/_$?(r n(jE{\ tnǟ v' Y8i0QDCov׍Qv^M LT` !1X``T9"Ujnރ pǗ d[!c8%PͿ{oB6ȿxյY)6nf凓*-oCC# mr#'0o[9 /eѧwE7 ?_$ۍ>^B3AtOD~,^}a@[ljLڲ{s>x`">mP`3I[jqv+NH65bݎ$`5%,QkT$wt?[}pDS(%u+yW/r5PW(K֭7륎0O1>,@Φ1X]vP]x1jY0?40k;[H,7/(0Q# k&no[4ҍq )+Y-m7Ivr}lqt 6VMXPb]›Mv6ivk ^97,(9 Ag;9B2PF}H [IW#YJ\Nng'0N'8\ɦ/frq̻cۅU^|f0ܡIw8RFm*NeMǬ;ͳfBK͇v/gX?=~X~fyqzK'^[˶MAo}|zwZW-y1UP-= %F֜Ld;9ǦXRoW._UQس넠 k5&n=\(TA{Z:}WDXY}ųgvKW;ctDKW9wyY1>}m26Yv~!lN{ƻi\b to?׃,X~ooyWn~T߸y[{ӓms`5A]. nWu^qzVw>780e|^aݻݡA'ÙQvwwkzal-[ϭH|bn϶fp[_>ǜ:e6kt B7K1MO:fP]m̑(#I2P> \BWR!-++%i!6teRU2hW<]bGWHWjDWXRV2pYm2P|3PJj J&0]׆ \BWěNWe})z҄V_ .pfZ Xbp1jMPRatE+eUrU#Е[ʠOW%c;B"I։0Ʋ6teQ2hteP2-+JFt+ʺЕA+ӕA]m!]1IۛN蝢X^9Asw{&Uv&*9OG=lv6zg?P}ce#s~6:kQ̖c,WPa_qPnY^ L|S^-Ђd 0Ce]`7h.ٍFͩ'BdSJ=J Br'9el)u1Hr\0M̚~7,e"uYl ]+GC$w1 m#e{SvH8|F?<{j>3dBFk+ Q`"sW&ɨi:"FӇSV<&D:Y-dx'!U gaqě(,a\DAsI6(5=?;ќ)P 4̘V"4pӈݎg5 %Wje S \;XRUVIt9y1.r04`f>^حӕd$ɈAƚ܊!Dq%W(WIc#|eX)Y(#hN|;f8CDQ2ZX X- J” rap0pT*ktQͩ#Wк:2hW}d#@7-|ѓ: K^Cڃ-.te⍧+2WrGWCWqf>WP \BW+(3gGW[HWivesQ2pEmnUX JGt%5+X'| ׅ ZF6 Jm+urSweЊ+rw+ZPt2ٺ] u5Y"kEP|<+]]-z,CFtjCWDԅ-r PRDwttEKs!l˜ЕA+M+2 lGWCWT8.U2pT+UX Jvttu|;I^Gwcqs^ŒkPx/4(,>w3 ¹H0>Ҡi K(/]IKVl!֕t$_߀,2sbC:6Mp҆+6X6;0kqdYѬM:/;۷Ve+r(!RݐPBk4LY8U4L(IS NS0qoaۼu7m(*9#IBF"2+ q`$QN(&W燖5޴}+ti踻P{J[ݎP\yRTw|b{@wL'eUɡ9B2i0""UTCXI3<.8j9ųI N<9E2F(6*չgְ_;9Ne+97T'_ &}`$%kO²\qI"|7. 
JvI emE?X]piP P/Il@@h Ǟxr[,:=鋼b~4h"v{C|߳Tñy0jb3[9پZrwL߳Ym3|f 0"\r/.7شMR&4SQٚ`>gP.l.J|q_HFFUKG"?b9l$o00 vPz^K -kh![ӍUz6ZlH٪Mps Wٌ[\ԳL{6/ev\}ߐsbVc+6µ oǠ;6z|`_.Qvk/.tXZvdݮ܏7 o-;|`nSiΤ /Yn =vDڿRj/x`ph(g}Xx_%$Yp߽53{/m#ƺ5? #;??|}dy??4G|sK+r3Bc,|\KlVx RŔZ󑞽R D3::ڧGW!e!0BCf ( /#0d2сs.P_nV!YIC؆D=Pb@ 8!*$LiD _[g'1p Ќ± 1S|ɴ;TKP7sph}wvvU =1cƱv@@ae(^r)Eq⏓NO_UBh4Й   ?{6$Yzspgö[AI.V~`ǖZեꪯJ-P 8`״1Pq[p'|"8f`،sMcl΄ :n mQ\J,Lpf'YǓ3wNNv?7r8Gt<}{a <&,PX`äd>;Nܖ>7Ovw>l7?,Bcaڔ8ۀAu#cϢ̾1._;v\]Mѧɺ|p<>98Zu~w~pVJ`fffc5] wa\P ehXx^Z`Y8*f`t` tw؎f犏GGgpp}rAJX8@\ ah<5Q}9{9Tئcy.x5m j#=5z, +"} ( &#þ?f͎ec("~$Ə~F>,w1>3ܸ&!M/W"XOCG"Ԑ2WqW;h}Q|;uX\VWa;&=^WW$YPƃr_?]) hty% fc_>ʿND+/=x*+ے\ؠ}w9&$*R&+61{ЊiPhyݡ2̛ 4;ۼ ɺ0Zע'E62&5q>&uY/!}Kak-7WxA 7\)^~6` 8}!bp^vǢ dO~<"C;F,j|؍vnN^J Waz+qfly>pAɿo`e 04]'yrВ|>kO۳@MN~-HEKC!¼u4 ÖLGUMSS-< go fzuU,.Uan%oN,h{9say6רDYԝH_(4US Nl 0\*u **"~ad)'!2_ۗsK^ӎΗȔǭ#QÈA͗֘;RF,ن+VUn(R:bm|rq!Wk;YH˪er[ SE}4$zXPe`B پ\Rp(Dv"W0̀dPh_aSFf HvܠtJ*z&ο/4nA[ mzPNg'=<)K:kSG=N2H槇3}iN[a=A0ԠI=[}-j z׍ތ=hdI67>0UphN5-*JLG^'ZcԬNy֖F#leN`sXqwi{slwB[П7OgK J$ĔA~ RJ|1)2c뾚">2^$D_*D2:DЁڡ~RU~sp׍?us3 4V#'޺7t7avtdfOoNI!'<h(쪙0eRi)nw;9H'lM4@0 uYv Fu/DhX!:Um|_Gp{{䨗;'z{;8q>-HƧ?'c}EMykL̡7!=3W ;).%*@\q80_l% rv1뻽<:;mQΚvOr/ oaE-Y𱕴Dx{y}n*wA wmk9ʩ |gȚ $cbWm&tkOڥ,*Ǿ~f~_f.DwU= - U#0u1)i̳Uu դU-Ofoz/w֖LZ =…?sy8^@uK֩X1pb1Wxv@|ݰ8|nR>\,]6LdLNlvSn)5+a%^n} = Pl/??o@&&Jzz)ѷ?HLQm^"K}m}Pr~MoJSRV4?~,xc({ nܡl>U(j RE*W6g7ͬ3>WF9vwdъ[bV;=;~u 0f>Sls 08iwip3\|@CHbkʏ =2ot"Z=v&KJ:0a<`1K[*?TؗY-_fY|RVgd+U΄+jGg|ٜtUT5mrɃe5A^$-˷ A}@[zyic.};,Qa˅kb(<aC~8RQuWW%O]Mʬ sqQ㚗N%tZ5y]xSR1Y)VEUbUX)jrcY}lS{a޾]-j}֍(;W[d3--fkY oi. ѫ·zu+dQpZڼT U+u+zpCw5+` -WIu+++ Wշ"K,]_' v畝Ȯg \Ʋu~B. 
WWq\!Î6pzpڪriV)t7g1l7k;Y\X-WVW=MMwa \!.pVKݪ•;\'uM[Bv.pܚڪri\B2 Ͷ aG]][Bn-p\nWLUC r߰/e_F:lrʇXa ]du(>Z]6A6xʻe/PU "kWGܵ+dXU+rk۫WȥW^#\٦֟Z}\Bvc] U+ಿ.WixFpSq쮏wr\Bruz8iߵ^Q5?mʉJiG_Qv8T6tS@z]W25MP$UfsnQ݅1=ɴ|8@S g:ψpfLᛶngr4/!nb>zBr[pffnq7|sB [7 \\%v] 3#XDSFq;s'e'sاLX& ^, |+ ~EJ;!nm4\fi \gݳujꂺ .lx\hvh<vKt=;^ZOcBya'-M!t.z%\iea,wt9%lO~wm0}XVMe7+i7 ov|e GqI9MMx:54#LWec,seVY3̡uec 26^`G<81QigEL;8>ԳE͡4E)L.h]ơMy}ZrrC! fn•Oj$,' I@ /B>}ן(l:%i8Ml7\2KymQ:F@SNQ]6|޼N$$b1wS$;u&kdbI}}+ <'B=uĥ %j32)& 8[2x5": \{m2#>ZM3aTD{Gh,nuuxE񲼫R|l>#&Zl+Gh#?{Ƒ.Q2`,Nv\ =,XCD"-_!д4HilGӏǯ(3=*ʰ w~LO#=@%i:=+Pt3CB.#IAJ4, o#Ą<3E4bKIa" nS3@AMu#8)yyTYJٲ-#4|6#1Nb ũwz|$pGVG IN@YG 4S]z;7H%!rj++DU}|4%<~@a)=gxљadMW#cbu~bATr|KW *ʾ4{^-^>\\/_"IJʣ]hUқW+JJՔC2㏵i˻epa0Fr̅~lެ풣ImդN O7@iSOJ4rsOMnifX##42 WbAz>{ɖ!4^li ^e7!yt,|>1ʯ\1;bppKr i U]!9??~,?O|wO7'oQ W`Bu4/C-^XzfB|~ |v~H̍⬿gh6,|4U 6c޸3o\s}e0nu޵muͻvZ6g=[]UmN߿ n:\ZwM~Şw;88lclV ZlNa"a87'Po^𓦍?gq2"#dЖyw܍~/#_Q5= aD QIRl$0O`<ct)kK.$Ǹ3Ƚ3;2f/܃h0L,(by*T3LY 7N7~+UνK~6JT}F;yFiM_ [#STJ/O^m:je8X7[ͼz=kz,/1?xhw OTض&ʨ=u5ީ.bYjv]LuRojZMz[}hE[y@x*_c_uU FZrGЎ*P/GÏⓝ鿿z \8^@$SJHy:mԤhdb1J)@["ha;4R+;3'/4ste|;ڷC>ڳZs4Xn|l{Gq c2KǏFE{)⿚}AewaEgn}eheg^) @&~F-Qxyj $ZNvyi@~ '?Y l@ y?" C<)EZG&EVWq[VHD5ej[+s T:.{ s*:vUnCIL?eܿĤLC*:eYZ02/Zp&#iD j`[VN(>븑nJ`D4W6LmiftfSZLU]8C)1cڛЊѥ|EX)2Do"AejE]M C=mwDZ͜z]> T  eLһ,5Ao:"?Z0퉇ӆ4Utz'#p nOףVh5!D&g{vOv-gDł/ɌFeVs5@yJ`Yp,s$jǽv98X&7Y,cr"D3' ֜ˠCʁ9|7gǣpi.ד>;TlSLm4/S;~RuOTs셌0gmzA!DÉDJwzZm)n _ej }XnCI6$Pu3ZPv><߿hZ;h$!Zr #t^S-(AE)AǛ%ɁS[:h; cC9-V i"`ʇKX2DQRDʦ oU@˖" P-yfBX2% 4'͔I:ĸ)T@d SuPFkVr2IxK\fK +#nm8_=EQ&4oZ?/+-G6)Pl*J2j$Jyb !XIR.Y:K-`DN8۪{jX,~iۤ8 fDFYGs'H0BTgf'4\Yʐ3=W rTY&%CY26lRvޕi*6K[&"̥%F"we>2wMw`[IYHʜ@rxQr%+c{~0L;*"7?g39GO%\!5#oU/7TYiNjTqsꜛ/;|Ͽ.T'Cd%-Ym5lT,x^Lz3d0Gmc-Qap,K?}nT 5KCgqJy:/cLmtƳ|d2L*3υLFQȭϜ.qYu78âE[ NXj޾a螲Cpˍ=6Ss]AS :PbF=|=똸>raditU#H9P@Ќ^v>[12p2ژ!+!%BHu),/!m2d yf b̽ŵT y_7:u0fm]]2WL6T#E\Q՝ :LI#,0wmq$W|~ ;vv^ *5Dr7lDߩaYGKWuWTfĉu"+_^Yg"v^r4@=Fa NV>?Y)SO?^>/s||A;i? 
߼;3Wq|_ì5Z%.4ieNhXFWA&)UbY+[Qgkw݆F(ܐ22 q@m_ͩW|YnXۦJ3lFz6aW yg#/O/_϶N0sq8[.0%Еf8W+ٜB{R>g/;S b!r{OtJpDpEp4pE#+?z"+fJ9b<"ǣ+k8"kU|pEVZp •vJyDp1GWd;"k~p+S*[@GIR:g)1.k qo (ct3O"O:]R3һ B M߽-鰼8_>NG 9X[۝Z~awл_]AC럽\ o^ vq:h&Y ^#R;f3u:0ɪXuG4F쭼y`f{ H>N k^S?wk+3S2g~u#io/ؒ2V6&]5YYU.G4lw!IWdºx1IOݾdDuO[=g}ؚkS="DhZnEjU}dLj#mTjcNL׽jQ唍 MB"UJBYe}ʹRpZT{g;[Zfד]kA ZZj 9Z,\ &#JcTJ u2NfGm+G63ՌR֖Q~n›m0YZl&&MjBp- wOn3:`4v7cv[\\T5jXD pzG=[(ԜwQ$EU^܍S5ԺVDC*)IV̶:ѝ(sRBLmV)rNipՐR <5$% !`BDE7fͰ`⃡^.%몵QV nCj!0@ƛԁq@xc#XYd :AJrhRUA 0񘊒'8$;ǍŗX~AyjN$\Pd"U̫ &deZP9yOC 1|Y%P Vx$ۑ o&cQI,TGW>/Az hFyܶe8Tx*Fo7:8?=7*]@,M$C.2T(ʈ>z ^{@#BK|v)}ю߃b:EjC$*KtPK|̡cLFuu ((LEQ{pYf}Jh v% Aܱ- %HWPPl5kh'6!ЫtJ߳yAP΀@ʀG₞k?n,L  3g(fWbe@AJQCEdU5\J`,*&a!dE(3A6 ?FtblUNX)l^ 0n0VIGو$Q5$YIDe(mVӥ*p/GOnr MZ6P*t `F~+%LalmRj6k)Z]sU&jP@0u1vtrhѓХE8I0ǿi ~;-,fwm5Ek)DzI(ΓDɃakBm2&<;Mٰ-'St1Eד%\I"uA:(XfhTYDe7kQMƮXB"($'Mm2X4kn3`'}|WLF-LʴBE5Ff,(*Fb1;z@U4hp?{DmDVc-S{NŒ35Z&z`IQ{/AFk&dd, 6UK.dP?uN^y~gP׮BTD oZ 05^%h#d8N֞ +JOQA,5\b&*knvM]qc $LNMh`Uf4D2_%d,<`VMZKIH20DqBN Z&!Kk|A9oU hx;S@]pB T Gͫ^"~7_tVaTw M8 >ˀHrozO?|IoPt%L5:/6MVfӆ9\E6o~ݮ (c >N@Z-PVZ|B#$ uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP:Uc@`?" Fv4B2Wcꐵ=vYy%0 u ^b~JjgtVV%>@ze,_g+Ov< <vPlXAw`%ΐ`vV KPi3ZG5i?J(9 ]nR N-? 
lfLmfTT  :%{H+w"6GL:UFeƒfYKɳeC fW|bZ膘jH;z-,.gq9Y\r,.gq9Y\r,.gq9Y\r,.gq9Y\r,.gq9Y\r,.gq9Y\G,.s?Z\K#.JIq4rX̣JmY\-˵SqWeȒ< xW:@g+|"s]{⠗u5?7->/yޮ 7~,[<\E5/^?<=̗= Gp}q9b,E,g/^.V` g>V1>,r*\ǟǔDr["p}Տ]QnzU^WrP\AJv:Ñ^ųU|\=5%ON^/K$06 ~~z.{8o郗GVxK;r[GWn?N{||h9vxx#\#!\cx t !zTv8G&'8 5|84V{ ?1>IUejTi 5iJвB){at¥K u ?IܝQ5j/2㨞HHH+l~{5w?Cf1c>&{$}'"ȧL r+nV [p+nV [p+nV [p+nV [p+nV [p+nV P+hyѴV?VKn-T㑑< uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP:F7˷ϳMЃ}߼VgV@pk+uXJ;_)V0#2D$ d^fwr{g5uJ<@=,"Ӗ)w?ٷqt׸ݱPXui][#|-|Su"g$TDDZb-9o$FzͶacݺkO6 [\]4lM/G#Ce@N|qvweI23S0Og=luE,Jj`F(b%HRhXɼ*/"## [30H%C[>?u.6RUo(+EGumHȁmE <6\lY,TvX$` xY8W<&ppg$d[Q%qK Ш&f(Kͣ"ګt"Ϻ:NK>+`P'ZrD'D4 i4x&O퉉&GvYK{+_ rx*xGH B1 7,O&qW8|mqCΠz;̻jsDJHkNQzN4slNs hSXMj+2>G'ۛƢ8mp4dœ_ &X2410j%ߴU/p5 b"1s,$Tu r'.(`s֝Gv;.:#2t|-؃ZG<9rIBRYb=BL歋Rlh6}8 +$A߭;Vv ~^}(uiynky#݇*%'wB7n7J=b *d6vz_C+ zU?ǣG]eq2:G݇q/Cހ D{"p{eo\uU/I]Y͐ҷ@]GcfۏFF/ta*z %qqi%$u@9 M}S]USzXg6^p_"ȓXmӰ"~Xqv}SâM[Zz|3LU=@!'ɫ+;q{@|CׅkD0hQPJ7RI5Û\0Ne4(a7Q"O'=v%GGtrG]U9{7 ܐ#a}ۺSl;Tf npMp.:Pn3|1fLG%sL7 ^eF7#;Hq,Yrr-ĹQ)0UGx x K:s/U+~ʣؘ]$|s7W٨8,7,)ԍ9ȨD#g6S٪[-lC5ηjS:\InZ/YЍ.`O5GBkhz{bG6s]oͲCh%}۶Qww^e kR;knC [i|Hu5ꎎ+3f&~bV%w4Ml8yf|z%3n 9v q)_V:Xv`EN: }K0ILˆ1F"cCB94Ǧ)窷;JI%^~]`FeZ6DL,IE);c:OZ0G Z]ᮃ5Ϫ-VFnZzםJ~ʶBRJ9b `s<1q|>XWP$RZX" q rsT#(zLX'Y)@!FGτV()5{84$2gx:%"xbAsX = =7o 5ι1eYv76L3?G+Ϗ}U۶Zh5 1(=.yK9K+tWfuɳC8{eaNvTq@!yم4[ģ9KENz7]&pV\Or`vxel~m!>/نkD $dƻ{=.'Ww}s: paG+CHd|I3 o!ZꉏI+`4H'S k7>bX|z4ITtGGjBH|H% l}εB2N[$YB ~+z箖ܵ>G [noydGT0fj9h L*/<(jJL6rDŽ]Ȁ5۳}CT;-D ʔp@͞E"F] LX.87籞M0@3bTiʌG%Q,jRU5G{N9}rIs[']=pێ#~Õ?y]c]ͺHEJ=wG$9k`yX6!XY2&zނwT+n{_$#nPw _RcsWaM-7bKIP-_R(3(f )#ːa& 9$ 'e>+j4SO}t.ui]kF,+ג,2 .AX+a2zmr_mn}&m/T5/\~iJp4adY7=%CƬ{Ygys aEE!%@f-h;mr.jS+e3_ 3fϘZxٲ9_z80S<8zxD˫j.NV&$5ldGڶ yj xPU1I5x/WdRͣuCs[Do@on٘rSR%;̳vqPpSn|ԡ ԪPXEg&cުRɤ˔޷bTwd*-)<.Iuh6\_m;P*1oG|/Frc +r}@zu3QC8f!BNg I2o9_BM4҄R4qA2J0A$Bk-`1f$_uʲUqv)pɢ--*)ԊG,{%9R cZc=NdU.-m[m/ĐX:qP 63jc"0 t ʐ Y-^y:-Nt.uv NJ?XM> U.yLj-N}SqZޛR7;?jp# eG{ wefs9"'#raN)0q)-\(Jn$h>@~t[䌽kMl2].xdJ 6LFyar2pFË2K" ϷK;^l(7$YH\li<э,˯!'̻5d<7xsx=mY@o^dӨMkXZdy$,|>m+ 
d~8HN)ҩ׼xx7@_?o?Oo?~_??}O=h?̢}Eb 1 Wo|?>~3F\b ̕aֆa `eŠN:Ѹ1Km0 <?~h!ƫfZ.W]mxqeKnr_RhM~)7l [1Hhp1[ShҀ,N0k0`7S؟w"Yc#֚ Ky-.Ln 3gJۨ , vD҂)K\; +IyZ {ޑZտ?b|v9a =V:v6Yb)Ƶ8EbW. PB5M=|S#Ċct4.sh$=s˪Av81jm9já6C1$<޳ZvGB\RQLHocxP"WosFR %O/Fp6h KYT,Bp (1 |eF^aw7gc cxqM ܶ/EX [dVьq;6w<FGrQ_1.G3'Ӏ(^S,AP~hZ{8W7񲪆~'; fD'Pz?J?RowS˪iwS)qպVexhA,ZU;cn+qmS+ [“}3'\Zɻ+TEUqŬJ&+ä*\\S՝wWR2qC\ BI W Xt54\ZEiq*AUp%&+HW($ ZMlq* ɸ#5҄pe:B*`u\JK3z+@+84\\)S}qq*mvW}ĕa&PHg"ʕ6\Z`U<2G\Yll'ؤj2MHk6yf֛c¬U;Zɵ~ŹZq`J1\2Wf=\I(Ddpr 6W$(@+W+Ƶ*!\`x2B+P:PW=xmBRdprm2 *yw*/6d\Wnp؂̍C9Z-ѻ1.Yv j acd,O|~9"Zf]b?h(*^8O/ -΋O2|b6S1k'xzšbQ5 #K!*1e[~k$L\rB!ܢ>/.7U{8k+6x۷G$AW!aE-ʊ JwNٹEvP W,(tTT2XLfg奂",.TPuUL/%}ٜݸM&?gOR 6'ZL }»بRb0ĖBKIW(WTpjM0Pe&f\ TPMW Q P-l6Tٵ ;2^WZJbyBTdprM2 ԾM U<=2pe,++Bw+(pz R}%vڲ<W;>31Q KJ,h'xm;@\S);+W6j۬@U z6j-9ԆT>ᾝ) k @L]Za+To͸Vpŵ \'++Y*BZtWfwG\,Q >6r2\Zyw*ʸ!ڄpM:}W  P-r9 #h鄷`B8R+YA*Y*OJa/oʷW/wx)œ'> QIO:W:[ލn4=M= tb-K zeaI /A_]?>-{WڂCuq_<<,xཛv=t±&^l2~R̠DɦGqQr:^@PQx\<0' 27TF.]LUտ/V3>&Z- \'B_ Zx~A3=d8Hsnr5.Kro:#76hżYDQt!D )d\Hu$;5tRƕY5}ۚ{f'}3[[]/g+Դt.gHe~Y+1sP+yrBV"V2DW-LUYŜ` _\Elke͂v.eb|4;ì\mP~'˨?dؗe|D!+h t0Kh$0Δt *OIuI%xYu:m XVϫVXNuF5Nz@_> NBYҕh^aj1U2uLVe|gymt}R+Y&ccpy7C19\je/+e97{jQ/+FC-V35ъAv,wq5[*ҔΪh)U:"D+3Veee<3UtTkƤWAz$^;^^>GQ:xcma &,"Q2jm_7"J? P,J]ޗ!0 a Ehʣr.SN/NZ6RLԆNm%cO#ҫ v;/SYũbCZD+4zc[D#˘f8Fu)E)Ҝ =79OGiΌ`hU7CyeX }(t hS0>o\zyWx\x]nqS/hXM写(.hMp!JZV3xU*Ub^JE6LҊJ ^Ekjnqk5zvBv+sQq(7jPH9.)҈:rSQi"ZwmYg -}0'dxcuJ\QBR~}ERl)[TW* %B# b!hQ0$rŐ (2v,grYɨu w!KQ+?zix2=xǜktiꣷE IfP3妃'e?i&?jw]ka )RǰF*1m4&rAdщ☄#FbO;ቅRhU=ջ˲hTKu= ߍ71N{`&0z$90c".G.2N ޜ $K[w;"@8)5y9)3y𩎅 @<[âe0,,<4K)Ɓʇ|&@dR,݉ nty/q\ETAy(AjP*1mWjH`Q-'.q3`B1("PX4c3htT`B<UX-hb띱Vc&yiDk45[$:+Uɸtc~s~ 1Y~%>Mio\-z2 E"Ψr{k~Ե|崜5.66IrpҌԟf=W# ^{P]~zXIgcNX0EͭOnyo|3?<7r=?5W;7ϼ σn,|MwL\[:N^T7c񛡛ﵜL]8fygBy~`vW;:wpR 0.i嶉cֳP|*HZPb5Fgg|?*a?}DK}ɜuRZJ5bQJh>jk6(Quy-sL(QmGs+:2,"1rbrX+/B"Iy-sX{ԑ$D:۳s|Yn̲ l}s>^"@z"K#ߏ`4ȕO\^RDG%Uޣ42ͥ1gu.r;]HD\rT.IaLxکlI^++i[y:߀1T)b+t@HmwjRz\Tk`b#{(gݵVh{=a]ՌT0ghz6BL2I26Uae,TLԟ@)np0sH[]D‰uXTE fC~ }[<cOA縫 C_C<>hFGZb_,P(IQT[rm4J! 
yCVhdJcL l6 D:Rb&y$,ViǬQFYJy>%ckY;#gM~}P.\=DI`Z˞qWc'YշX:] Ѧ -eaFsw6eE!"AQ 9y7"~XK04.Cx>(jė* h ×^!wiJERzR~}<< #),:r4^%yq lY:@s[xztlc]Gu}y<8[Ä0_8蟞UkKm rp0|jwb|.nCoUm#Q`*&K>ܳџy8`ֻCݵYgwCH1hW_6S[z,MJIW_;퟿|<ݏJ~|{|'ww[:Q۞j\Pz}7SzQ׷CcR&LziFO0|+U.z =[ݯߚn knMwor_#e, Q!gq?lcqtX#-v/cQ=('Mʰ9lP^𓶃"2(|:#S7O5;oZzpUPQ,W{I2Ncx%JbRc6Tq8hLr哎jn2_]1O"rm~ ETXV*WHw2:㾛u.Ewu,x7Zí"þհS ՍOҳT&a?I`?}ZVWz/*} rOC޼ >yo&b>|kGpϾU?\ȕlE-'[^HH}Sh$lזPaz"l[1"ϖƄµ:^nuߔWM{ ܠyQ^S/LI_ z~ЫZ]Ua:c7gaG=j<#r`w Ěf:h)T5e`Uvxu(RW`!F]z(uhUR^+H]QW\E]%jޟ%*u>{JP!+Af &r9=u6t}JTJ TW*E+ X#t0*{8`v;mWWJKTW+ XIq0 HJR*QeVW/R])o΂5ey-w߿9jʰ?yQZHBVbJ 7.Đ* atD c|ޖ kN8 Z:k޵6r"ewԸ.yN' $8NeA]m˒#{Ο?ŒdeIv`4ŢHGD=`@_h;lRJ|Ŝ K[WVAUv3д|܍oD9O?u;|Ŝ|),[o9t4YO`ykPriyT ʜt$A&%^zK2*NM}鿵u'Z)J{!:2TEeIx!&霙,KZEmJp3G[KH?Y&-}f?P dXwY֐Yǒ)fSJ༷ 1iVgJ窉f۹EO!W6r4` 5ېd6ぅEZs!1kMMhBk'n{]/2:ǘ%꫌Y0j F꘭ѥhrSTL ǘQQ뷲ڵv-ߔz&Q[ zhڀ#2 ȩ ^P:( 69V2 4aV|Ұ,v1. 54*SL4( TAQcI;ׄ=:1kJ$A0f"j(>H߈ɄpN3% jZiԴְ=(4jQ-cc2SWt ̧Kf)@V/G/+jlU)gjh^XZxex :Hujb)K66 ~;xclhmx2[+wLVn%Zuz;-Y@V_UI^k)5:n`b$L- M?o돶rFgr)\ӘE˛(y,m3%BOYEr @ioIJ ?u')dM'Dࣱ&'  ڨ"٨h%bƷm'i;Q#voD* Bbb $9YVdzNrɴEjc9Y|='TjEm#Q@Z*P׽܀ g<r{St?+CͣgrAn#w˳RCMH:u@I鲁`s:Fs9AHḁ̊8QFƁ8ㅐI!nV3|Ԗhis9#g"s9?sG.UkjҸgk͓wK꒩V,=;,rΕ:"R ZãPf(OYM^sp1`Io9$8$nF݊Aśn<˭gbZ%96\CY 51*BRB РgSqOJPpS?M{OLYuW JPa%($+4 *τ5>(gڥ鵖])-)3b> `!:#.@8EVU6.=z:??{q)@ 0Sm7^Z7Wbf%"rƝjw.4LhI:wTatBu3Jͨ"S#OPLv٣̠33s`"|pQ1ghre,:W]I|RMZOD7'7#CK)RB>c9w~bukؐt߹v!r 9*Kl-™ Trv =J0a׊~nJUҜww&wn]u\bƧEp#9s_Lx+oE(jnUHӅϙKҺ&kG~ax0(,Qc*,\,+߻1z0f;09Q9DLGa(ʝ>cXg0 ?c\lxBúN^_8_?O\^?/? ˧q r`B)gs*OFt~B9t?;йpN ?3tWe5,zԏ85IS:aZƘ3_e  _~}h)Ǜ - 0fMz]UMNf XbeB bn?}Cb bc86O{/uc9gd2 D1P+#5FȠ-[m~FG;ErmdD QI\ۥH`rx0VS֖އΠ֩{Jտc<PIE,ÚGoCt1_`91+ѿYC݄jd7&:HG F`n(DOd&mzw޽ܙ=- /LG w;95v S$wܸSiC~7BƝԿEL3$!:U5ۼ0.`)Rn|4*멛^0~ʍoMb&|-xWv}hboggk%oP_

|`m/ɶ0pkCVdXnC)*-W]ۤEqPZq4g͟7 q, wMAʒϔ#,#LEтv`/Gѕ|D|cUDuN,)Lpv)a#3&gddUf m>P8e%)* \h6Vii+95 7ҕמLҫWp:-2 @E(ԂT0 \눸jaE4j XE1x9_ ԃmK2knene41S@WBJƅ.RYJ_6m2d yf b̽kmaUAk㑻aZmCwCJrVÇӭt@5Vt&=ODZd{V7Ē R+bC$I.4::m^|Jr=w:6=7< cs^D4d3>&EVWq[JLHDޯq.SDs+h#z__vK-Yq}JrYcԵ9Ȩz23ԩƂjZ*;%ЯvӺ5\uۓVuۓ m$ji$*jOhmgRxdʅ dc#|$Z0x z=Èմq6.v q1gPP3iy^gcbsZDT +|$.KMe"qN)G*VHBL&aY1ovoDEWqR6ꦇ:f{5/dxi"ŕ& 4ƒ/ '^3K|2;+:z{rָ̀TmY:+:KMeisq 짒xpח5 P˓(hR"f=z?S+p!E "-Lk-ɉK=&< .wXh[cj@;dri"`ʋKX42DQRDz_bH_ hVA˖3iy}੖<3e!MfNfʤRhb\.@d SuPFkVZ8@eL"!0ٻFndW6eއb3 A&UMwMb,ְhIuT/RJkq2R\ ފG}1Sr׿;騍C+fՙީ+"yY$y IKE"ŇLX TJTV)j2p8s:" e6=FZϑ^0q AHRVa S띱Vc&y\4zl5ͭH; OB)ۻ-9U&- fW/_8噊oQ얒 pDF3+O UOVߌ:+Sx9K%A҆)J97f cZaȀ/W}]6ΡtT:~zbYGJP2p>4j\-j^(0?SM>l lg/tCœ4Ose؟3֜vZt[;‘˗_)1X>,^ t5CK|6+\:(l{sMC.2+-LJwoN;kHE>ג)E 0Oa^3/TpF^Wg|+n.=\/rq}SтӂɒZ(ڜ-rm|Sv*#}f ۙcf)Y10 f1gra:F j"t1bYl!VIpbŃsQFaZ)RcAA;8F!HC+M>]S&Jٽ̬sd9m96jLےa0AP2dX^Ӹ/xkޏ1chMq!ͲiSon"w1EfTXJ 4#(wR?ܛ*N77M,\3R$r)n'Qš{{ !_}pE41{iZJbpuـtݙNr&ɭp$w8QDŕC ZIO ٰƼ1}t­jx]_w!4+YpA,N>v_wOQP&J.F~|3|ErXo|˄ӊiY;҇d.W@#AQ 9y7"~XK`$Ap_uxm&B` Sב HitL*"Q* J3K`@AgcW 3Xs)u\QV=C H}'PqfN\oJi $=Gȣ}V/):X1SD:7s)õu#N|nS  LNq !zr$w0 ]wQ&/./b4M!>F秾O$ 9Sd`dLW"tQN0G)dް*MN_5V%wm֊+XBbJsԧIl\`(_=ٽ0@e=괓UVU|0AHj?(sdbR/X*+י\;>^?ۋNj?^`.y~Ƿ F`, ~<#a2g~~mu7=72,%+ݴ&9*|_7M!C8y9iho4Ul%דߦ]Qg+jLPk_Bq@_Ls}袍񠍴I羝Tn7=Ci^xRf%KiUg;<=UdCmy |O*v.yTaLYx#A` &F֨{n R:Dd?}uVkBu0gtK-[LKstMTaN"Ezh匰`ZM4IʔR(9W6rmI݄ejF7{o庵c6Xr-" ;?ջcoIyag+j,7`,qd7CBYhWD(ew0T{V<9>_鍣ݲ@lL|M߂de:ԏ影]PG]g<k{ؓjVq 8 %rv i)׿h*`g#w-- Ўg$l" |TR-Y)j:ftAA?Iˮv9痣XC|U2smC Z߯B|8!]j(vlR 3ʥe˟N >VMxn(oEh_mXe FBs:2_w! a(JOrfFYQ/\;E*}FY(N3g6feo_`ڔ  2v/Y _oӷwa#;qFåMMRzݪw;dڌsS6ls(Xπ2ZmT+Zoڧwܐ`@0}uKJՔ`@-C%*i{^iDU5[6,3Cu1s.`97:U%u3|%gS 35Vk2V7Ē@a/~cŽ}mΔ䣜\Cj3V<3Z'je.g.^ -^,\ OZHs{k|.F#Ў$3h9Hr1 u:wʡd :"̅I(BQJ'ٻ6{$W8}J9gzﻯ稴$ !6K-k ~eZ@T/P{ARCϹ ^J} p2|xq_z Ysf{ń^LU~QUٳ"AHY ,AcېZ+ל mԢp[*dzJ6sq.[\BߘD܅.}6  QghϚHortpf bngTx.p`QezHIˢpWisG!+jg͆s Kl`P񹸇,>IԨeֈ%$?0}4@. 
}u&;Y% <_{I[DND< ִ:'4}٘WPYGˋ X`Z͠Vu;I[cR&S SM xPY~$mdtBM,.81iF$d#CtDIdMWk'#꜄>Rx0mNDhl5:1ض)\c[:=d@0@APaC I[k1̲DR.6YeiFwh+0ݯsfRCӪXc} 5a&!ns$!Qyw4׍5]0Aȇ2gbd`Fpiu^dBIMu-fr V61%$ރJj ZI #&!d! vwm}kj>nha7$j] )co3OڕZqooY97u+7߿w^dBY.{Ew/lf \T7y2魢7̐Jwۏ`lo?Τ7'X.& i0I,̸=|MV0b5a@xW4#O_s*cN.*L.])TJ|ȉf~)P G;`E"ZXJ?|160]]6zo?T\ɁSoA{& :ۍ鬖}ۇ1[sZ/*±Qgf'kHGv_F6@xRM r 1qTywO 䌡&"uEYʳXal U"\iAE ) )Bz#%nj~(ܷ8ڂ~yu-y|8xձ ug.2ڟ#.v`cu9ơLl030;MMn}-į$%La1{91&&Œd]b]ik6kI6SJ: 2Dő񥅳Dk)Q292׀DDA"r[^7 ael_0 ,.JWrJQD:־p8 Š],qwaO~6cW|(hOKnU{Ljߗ |w-|xS4D<IVxE@%4zG2V_)+zJW[< œ\(T5, Rn~9CMmrfϢ6yFmlkcxu 2Ƽq6kǠĐY!&TH~/.sno1ӝdLv_6H:sNBa ,d) DU 0xFS 5;Xy`>ɯYYc9L~2})fA7 > 5Xg}Vzҟz~`jzZ} mZjj0'b"k7nA_Tt8DZ3j=mwO9>tCjɨ%[P[G=it 耞Zz1>~?s>I .[s5=WY|#lN~nWk8ZF̗1u@da_HL^6BΔR x|HXPRIU/ է.7Ny]I5m&\Qf ~}2%"L"sTؔe@=#+3'ˏs!7h4i}=eMn\>VB$~, TOnGy hqo sJ _ҍ XPDjH-1D{E P)] ]J5qCȠYO2t)BCQɁh彳@PpaiTܘz80mB؞;_$ ޔ ֵ[{]˾#+iI UL)hd.ҵĄFTt0y!Gb:#Npn,@XY*#a lUHpVU  1wN9혜vvhxνBp\Cxy^$$nG/xs"Y 23WLU#[{5LBy>U!^4U!HRq1ZJe yAA$1^< r`A)PG}Y&VR KPXrJ"D٤B B<ÔpXz@/,sGTӓ.I3~`Ki3;cv<cnY֝C33şhyS1 ګکA=Kd#mBU^4z=XRxt/+txni}~b?hE]NL y ~D֍yK\?,Xl\4<\˘:hA(ΨM:y=V_Lz_FZvli ֔uD tKb:\lP1og/9 W4`yAauJ_U]\}Dq%_?Gum4\1krc>B8 q5V f)/ٮ 'wKzwy-*-{IDMnϘ?@[ʂ1₦٩8VAm~5K;vW{Ͽ"""dxbj`9t~&5o 6.\m'x@rQ %Jh쫤84]hFyAW0AיuWЀ320RaR-%} ζpjuP ,0FO:T׌mvL 1S<_^y}viuO6CV9WvS/Id7%~^% A9'Dy3An d1x wI8FaS>7nqcçB(˫b`h{s5)xcfP6?T1]Ex2IypjJJAvϡ7ۿ";\U;Aϧm>M5g 8oPܫʶ{rBF˵@%ՠAѻE P)] ]J5qCȠYO2t)BCQɁh彳@PpaY˘z.30mbEŞ{ >B Gi^ $C@嗏"GmDR0K*Ӓ&#k}Cp3fsm|ӦP,sIi) FQ*Ţ ƣha&RFY7)Pʍ"VFudXDcZ)"VB"s6HrIB?rSq~ қE6ه#W}MXmCLO bd:oͺFE4(ndE3Lxv҈~ X0b #-5> -`_%\D~Xˍf5k\y5NѴ!.R"5(@{}/n/ĕ0b cD2Y+냉KM-a$EV 1l!+oP .au:+՝ƕHk/=vp:DYe=|w>Qlm ߶,GwǾ#e{\Ù=c -C;g'f;Pm w޵w.mC%n/ 8QVcw;$)%:muG2W|V۟b_}y +Z=Aㆅ xwlT=pk3 nZ[3 <67bdݱ@UQ5~;mFLTnh[|5J :M":׮]zʋBQU?8}|y8'1 6J'CZa ?d@Z8OӠX 7.0E`_墍]_կQa˪_R(>~}$blߣ=.s'H1XÅ }j!\'0-ƟRIa@[w.JP]&`Ϳo\fg^GGX1& w+0`8"a:(SUoO)˓0e{nS  ,!I[ K ͧoaAv>%* i>46>H2"W$*ʾޯDoޜN/Ns$0咥U0Um]q 
De=uQn?⣬.@iZ[漽Ӻu^ӥcеL) r B($@b`1@1gz4‚SwV]6=59ot;B[2wհv[@`ja˜NPsZ:SI=J9)%ytSE?uFP`n^aIZ.@J(m0kMVؠ1c[Sn v9OR=Z0m e`(RGK]Ԇ dDF8ǜ)KM䁀hIIű 9N`.0BʩR ˽);YL.r6 jx?)sZ&7ڐh/hOa3ޢEݫ+i*oNk@cOEaaBD3,N2UtRh54ϊB}o@ECq飶ěU`̙R191 qƾX(2cXxEzv4ӻIҫsiDfw@aџtMBFO{JH49cEDZ*4hk(L)XadElh]($cxTH'KLIґ0K9DhWL̶v6efԖjw:(1i Ƃ,!ڨM(*!+F.qVqB3  D9#`<0G:Hc2a6r6aiԏ>ɤCƶ b6UfDT"vcI Kw:XgRRe\(Q@+|3.+"jmD zHL8cuD"X94aKn19Smu6uf}qQgEb88" !N!=0'BD7i3^`Cb[0w쉇9og<<ep:'8NO  `ۋYWGDB0 *Tik՝c%!HB~V: E2g@iJ'։1ZX(QuS'1*Zms+:2,"1rbrX+H!A#OrYw.!t= p&δH!zkœJlLQQ^5S ~(j#el$h;#}H8D0r3-1i\m u h % &pqg>%nCYjB'٤~&-RX 2CqH&vRr[v}:Dfҗrx*ElWq*P kNZƔԗ;)B+(cTJXObpp; kGm>\Mge|d4`ͳߖo`qzvi&伝\KAOF/|=ew}spI+w;y%xhr_dzY%ӓWOϿ7ox,ޓ`)z5?~˓7u7r8:' ^uRI:}ngngןh;ϓNqߩۯhTȩpsU]{M{UKijz6ѥSmwMnmFO_cl^Vفo8\ =P_^wWW ӯnLAodqFT/&I/&ErorypE2N/)x4G 8"[ RЅ[laI5O$FsR.Ƞ1dUɘHO*HXGWb !S&aJ;f2Rʃp-a^jF>wlllycrٟX^JE XZ0?ɵ~9$hsI_iJom4m?4Lٻ6UwSC63{`0 b=bݏzXV$[۶h0VdbW.x]rJ>$b(XDMw*@H$,YA0V妪 I%^ ,b,"MQYK딅huwVue,h`l"q_+-C1<3,u#qB@Zb:DC|).JȉO 0|t_K6 sI; T !/*m&D$RUkS;RS5hkpx3VfzNFpV%LJ'sn4\B{#Zkq;M=բz.8<άIzQf>U09 1ꃙ\ri}O/D]3(֑5)kH-9%R ut&Y(L&&O7{Έxb(DPل6 zrނ߶^6V4L&b?F0h=uOOY/GGZMӳ˗%QBMHrz\u^U).6^Zq\KW`7+}=ݛ>^݌g}zU{eplAiAtl%G𗋪ȇ^bpyԒ[\ff25v8juh[Wpi>Г6k: U%ouMn?qn=乌ԆoqҫHFӽ/X}׼0Ospzg̎W?/?^}R/W/x y0mꗒ ZKJoz\||1L?'n_oTnp\հ0+ig80ò _4/پiĮMk[4me 6.ݿ .:V881(zÝ)J2/)_8c2%Z,AP#p ֻ %" 8̵Kӊ}yUuvUт1] ؂`d)sdm*jN';{qOqe QGWf6;{ôUôôWb&IiaOkZ(6KAM@`Cم 8d5#աĚ]uQ]]"uEa#!1PBȥhIPPXKB`YH%,k,)]7jJ ύY4e }2%"LH2J c9[$\QHU35Ļ>xI멛gEi,|#jd7-t@CtݐY~n; рlH"kQjʪ-R-BS H>wQ`;BX,a7CFc_@r59.8_fYʉD7Uzw屦*JzW̓9cOÓpsҾnV{*I6ueM?5{r'S i҇ALJC-9uZZFkz H92[] t4lt凒nW 8ra;J 4 9( ȫPtƖ(uaHeJBU=q2f8+0jdʉl%i:T}{䋛NZ^ sb7:~u#t}S.ΏgUɔ=FDw LQZWB,3.EJo;*g25"1n%8z MR\+.*Wr*BUKpg,kq#ht^6E-}u!EpMiޫy]q,Ȳ4voY^ ??ߢOw܅% @r&} %Nd#K<Ԕ7(V 7{m~ܙ۩z_ڍ3^>чo!If|#`؜b* +ct*^WR  ЁMI&K]zMsݛR]4-KL2a:7ax!Hu5HeRRB!Ƴk%Rg8d\G=l Hw5QXTE",Z{/MEQ!F́AtW87Z z%@íV2.kY8Bxwi %ص3rnUUЙ69k^_ec]%3ҷ8^daSTnȎ^ů: ne2J*!cH?"jG@"lA]|qA[rUcx$X~N;3#Q> AJHѨ3޳FEV.ڬ]PN,UZ/$3r,)c*rZ/4/ҪܱfZ;R.®b6?} {udLbXDO?4|4$.J}F Mv 
]3E~`gSbOR&ȓ9Nf-(NBd`AԊ烃tVk7dI6DNA6ZCl0*8:Q{3x$6CeUqad8J¤xW"RP3FEOgk:H[w&Iy'"|Ȯ:6eD f &_qmHh&֥\ exlѝx,XGgE::]w,{t^L)7ԜGfcH2H8vaDkHЀqJcxa^RjORJI?)v,lIF>RȡFjfJJ4l:USU.8!/A$#%%G7EҘj15LDNFbglqr ((RVvΗn_NxRׅ>J73''6p*esW5(ݖ韝WY4a Ed1@Q.{ R(I& {+q^൓HANZtEkT-u klxDW&4u_"/!'Q)P F%$æ,SXK׎ȹj=jjp]̈́95Gd Lx.YA8?)6K;s?u]ƣAs6Xl9/o;BR d2I4l >==ᙽymcE'Z;S2B$1T8io{voS/g۸Xη Ȓi6!2ko늒Z꡷$mT^4,A&~u|XŤ{¼(1xz}6S=x_YZ[u;`vi-mW{"E]Wm̷S[ X sS;K2B“i'WaP/t4t;7ȵ6yzQ)/s2#4SV[Q$r֮WYcOMM'?6Ц #ʬ)V;/%ѫgGߧe)Q^[0vzuz^og; .5;kow/T>z/7]/ơWXՓeqدgdӣscb ƫ^f`濞SթijAUO**gf~zpYm05{s崬t6~GV= ڵ,< qzy;7fr>/h6Ri۵#ZD0k-;6zsn 079뮈ft²R 3mQd{{fL;:SkV*&ld=#VE%{a 8vߦזYxhJydvRAͥcfYGVz9(pŠBW2:I4P^פ(9fisIfP;;b8TُnjqN.JQ/I+dn|@uBj;4jivj%^Q .k{/.Qc `ґBvQa$(x,;vDiF* A\J}2ɮ3 DeΦɨRȠE $e5̀j@o]\A2n@Ơ) uڀb(4v7Eq #BlH `L13cI1ά :Ÿ]9 i+XGeґ4J P@S1 Dt0GQv4/ͤ#kGRP6Z'jC'feR]+Qa4uYI5)eBi5S<`FcxsZ\Yv7~)bMDq|RT1y:*S{ B mB0|~oBmЖr\ZZ|ou/+ޅL}6=Dd-D&x#!P8x>m9@VZLAG= ]I4Z*#``!&SAyCs ~XA̤ƂBgs՜P$rDMWD&deZP96zOy 1|Y%@ VxQ ox6cHp+KPCKwV4<ܶMR xY+N";>nwmdZD>"CO,%VX(;KSvpC*Kt]H_>1&!Gu.b,C)9iDκ$!;V 3L>^B ;f!3ڄ`1ctJ߳6"}Ia A*WRw]Pz`*1J[$=|"2@z[`}V="Ö+YFzژ[QW6b]%r,JxyŴ ap`R;J-*FjQDFb1;z!Ni,WQYj0Q380)`cGjt҂cMZ b $_$q\dd, D ᪥hsOH:'/Gi:?T赫-u|nj-*awU$z(qcX©:ltiP/zCL Lčr(KD9)B@yކ`ݤ>jE#WnV˥!0r(:ff5ФU4#!:.P,=uQՒ-0 Uyۈ'3twEC@L ##B`j?ozF}ߋ+<HkhYD)|noANO+-HzZ(uȠx1rl(fAB P26<"%Q̍E D+`@SSy1;yb54yf'R-iGy}Z r:z`sYYT^r\7~īB4[˷Bm,gggxjm}iS97' zFIʒUSg[3fM-OeۛMC8VDL:P8Cf9X*LVa Ud&0Y*LVa Ud&0Y*LVa Ud&0Y*LVa Ud&0Y*LVa Ud&0Y*LVa U|UZY+ 6=籑Q<VN|*LX) 0?-m`%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VUQI d|a\Jh@V}J XyO@L@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X *lQ=x@0=gYZ@Y 9*\Pα@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X (o7Na꧃'?i)]j{}~Z]{@3f}Z 4yf]g A&q>;{[jA:_\P/vRu^ <ܾW^r]sxOj].N ۷ fcz4^c=.kf4$9m羶s$&ML\諒>*N?/1 i3\yo0"i9oʒ3)PQJ L,86Wtu;WOF%I!H]ays."x m6eY6d 1^JbӞ|_9G߬*DJȝRlZ쵭/*_9rLY>­` l1ězWכ*|g>1jCFjrQY7%Z߽k8, [:]=0֫úc8;:ݵu:?{ kb]ﲻ@n*zR7D΅ځ ]_-jE}>οUY:kz{Mu0m<,G7i?΋\zttuݠ ?zDf9|ve7]̀efoe43;5N+f}U.F)(iW76w*Nmsn[nrcH_Ll-{ _q;V.,s'mոa 
6GiĠt^A?{-nCkd6LR͞E-=vm,Yt)hsʉ|sMmV&B)9348 3mDtIxH$P;LT2?91$\ %#3}y`F%ࠤR>pPk(Ɏ4IVINQzy6}k+s|9Nr_1ZzkD)]ߵ'uz͹4x}~q>{8 qҎm_ݧq -uH@θ\\h\``ea՛MhqrhzQ^b {S./yo+B zoc'Zзٻpq؂6mZ/شj{yr~qbyp~ ƅ-XpaVb!BZ"װ?:$U}r|wI2dI\R]:2& ("NJxIZy*'ڳ/I_! COgyE|Z9*nmnzTր3h_lL\A-ڷ:JdPU^V$ĆhbCFNS Y'&f * ɫn7*fִͥ|k: 31u%J hQ'X3'`e_^}#hqՁi<$qD'T"\s$0Lek@U*\ɗEp,$ьKn&&=Q]Auzp(5Nu]?䩔A)WUU¢ZynrzURVPÌ E'I6"yqvʢQbLޓlV! UF꘬gA4) !*T PI֌٭ab8TºPuiuF-{ԙLt0i Fkl%>҆8LI*O4Fg<;c%ˆV,5D-r,̱ A&{.2½geɕJx%f6܎ ,2eHUFa]Lۏqb.6:ں֝$褴c9 {eא2䀃]zl҈ܔT6\J2"C*EuMfB:*g1Hjև٭kR߄.ex*8T#tӈF,6:ԇ%~ևGPa?m'Uy\@ޏHX)+ ?%j8*%H2yųVZJ5\);Wǃd9BMc'Ƌ֞syJ,I_D9G:qXnb>ĴZcy>"[Xp`cJ`}%$eNUץLT[j3fN(Qzg1BCBh=u JVbZZg4 :Aj;VQ#/KäPHBc E"3a% .i>jKS$g@6=h+AåTY~pIAB.Dӻ Mu;VŶ;k\t h$Sa ISV5R*giWL .Xռ&*(LH\ŀFІi*H"#FSqFÉ i-n~vw> ;fPb)F'^ I "'r+J K]PIBFUPxʭA9#Q?niZK> s)*pύ>IϔF\. @$zOsAGQL7N|y!͠{9#MTj zz1?rm#[s>iw1[nov@Cm ;ᏂomI-]5ڛr]QY¼'m/-UF -n:\ZB%LQ8Zt]U4q/$F<)jJf!D%qZ&#$"/r)kG38!I_(=&eJ8Ax,x&8@\ QLsa! HY|2uTybc ?y,D ۗbSagXչ<뎢x|@%h| 囹D)$*댩 `r\!f( ?8hoGfUȓ-9WbRHĭPz!d Ns"wJh] uƭr4>D=zG^Eh%&V," }M\;z̕aR9}5qC{X" $Ks "H#c1'\1+1#1ɣy=Q*hRo A( 9[|ȩe8KCV觡8Z= r* LD˔1Y^QF,7J5!R𠢷bqY<|#Tlm0Da,!c'.1P|@%L TX`M9.&89-$ ~*BM$'ZD4qn-M@`Iٔ@)DAs W8cUeD{Z329"0՘eLlqKSSBi+w1qv㪬CO('ErOZj){4^*[|x4fF"HAy<A$Q 4$OX-LDA֨Z"@ +'U(U JB 9`Ť%B"D)50IC**h@Ѵg9덜rs&% w%c&(ȨH$pkV;!-–rX2a"Ч||(_+ziIqTcays;M}L5QGFwEfS7S͇O~ҹv7цHiÊZ.f` iM(ý 2΃ɊűSiw{vw ж;`i`̄FBYw`&h9LcRcG.2N o|_RtXz+l+|gke;1{kX ˤ:* ͗s)56ʇ|&@dR,+xr+l+g?",M `"(RVEʺCHe_) H&l"QĬgem)!rFkdRbJ-hs|ZY SAj0XDrA18) vIeE Q5Å<e ;d8|)u~ӤMY*)zV?uww\! `P c5QS1:rS8gR: o/[[o$]ݚʙذ.rw~hep5+L467`x7,8Spl8Sϟ>832UZ,LɄƥ"6fu \9Ԯ$C; YDJ)^tYL:<+6:"6GPA`WU it4D{S(sVA-m=-ضgC7r:\]2܁46Q=v[/_cw慴|7ݵ'9im Yfalޓ #;̓J3~ Ӓ G6^rÜ%Qi+D!t6]o h;^o3˕Ta'0 $H)p*cScbW[ cB)7j<ۄY#"( - ̦HGƾz#j!Cl.I6gNl K2u_'`6}z4}f0&L-M` y5.ffrt%4Wa dOxJ׽/ l}z [!" 
My^qtAx0kƦ҅-,GjprBiIkV,;'Qӓ5wIuLG _^(' ?0ãhwf;LMWiQk."d]Q)|,IcXR} 1.R:iT2mz (`FĀygqgo=TO+GgWrjհUe N`^2MO.V[I~"/?hlfN"=Sas$A$ȰV9SeX a3rUΰV9*p[AJqa= aa3rUΰV9*gXYnYjpdUΞdUΰV9*gX a3rUΰ߅ #gX a3rUΰ.,*gͰY2rUΰV9*gX aa3rUΰV9K[UΰV9*gX asI`V9:dX a3rUΰV9*gX ?dX|8ʰV9*gX dXW!- W3rUΰV9*gXmH8}R`h8qg%sƗq]FDф2X6s ]+Q2U "Ec(gcG)N0ǔ`;uV 䅑Q~ωLG eZ30IXd`ͭH{{#c:!CŘOO'&Ǎ)LOė{jx֌چ3;TD߰x:e"֪dU׳pǓyUXv%Rfii-'&BWj=V= \L6fJPj*].J_^=thvE2W׻7 d %˚[W{Y^yUk45J^)o/܅ǣ-|)CuWT\;q;߹V_dͦMѮqX<[.1we|txLƞz:A?z9 /]-Q*:I1E!H8!+\"ru: *D8jRڥ% E#,]Ƚؑj]n_޿kTv 3KUDj7bx$>aR> #qDJb6Y#܊FHkDbFZ丈Ga87) . cP>] F9 CQQj|~#|!˧? ?ciڅdSsx0yo"oÇ0E)5x3Wa0<@'0;ϻ5T9wWDx=Ês<߇[e%͵ ?5odE)V5m&U!h҂)R#'6g6LZ݂O_> ,{۷9jW> D~ek+ɻ`8oJ[sQ~KiU=!83_7MdȞ#tn0'kd܅Y],Ū퐿EeOFi $z(-,: wN 4h2#%.1ɋ|Ibʫ*.MHI93ΉW}nMo'3k}dRy ΜB1Wx2fX /tWőM1;Sgi۞ZRfKCn/{oҖ0C'r[h{Z %E)8fhFn#(bg_ $]no}8#oj&ˆoБ M_)f#ĶQ^ZAgVBu 8(ka^+]2QQ W[؀va}ِf K MOwPf@S b~|{ǾJPwkPX=@?ZggWcfPKfVmqV.tš؆ 3g ͲTҢ{cKށyJ_^vҗ' K[iԋW#;[S҆&,\[esa gʠYZAlRMCsiSŵyr"5'@;bmagu6f\~l)p")Q!I8["ِ1"BYnNX~hg;*^,jO=gA=$* wK'@~CZa 2 -']@\siqDzE%t)xc+γoй}Mv89G)e߸!ฟ%s8.w\ΫZڒ-Jm|,dNNgD]z2m23mY&omg C01F: lA2ߜ <L;n40vvWu62}{eOl(vځtFվa 6_lM٧+-Ϟ7 blƟFDz iZ ж_Nd]vbƝp8ٸ[oKjkq"yk=xq͖=YZC}fiXY0zͥr2<9J-c1k+810 .fdrH8+V׃.2wopӤ2$Up;oB!v jg>LLj9+NZI}.r06<'A>R8w}^^7gyWw[۴0gSL=)e :c´0*.e'>ТAkL' -ڔV8R:K("r4V ~EC`lEK_A:D+*PJL[:ꀃi<6(D̰CDhƢ٠w ߎ}&?Hk5yHy-uQ s+,5MxP_Ш@+$ jxTs`.0 B" T)JYP{#jA ::{Pi'B?mC?xϡ|WÃ>wW!)O" 00!J c\IFbwmK  o/{68A3>%my}gx1FeOL:~]]ŤIA<h&TVF$x+SF꘬gA4) !*T PH-[vViMFBղ,T,|iE㫌/fV]¸?Z?uOhKl%<҆8Th>9VmyvJi66XDkZ 0ZX"#{Y\頄W"jfNhcEKAѲn;%₉/R5Uj떥v`7Q٤)5I*7 RVtQD A1`&((r$Q ڹak.+~zu) }ǮѴ,M';*ड&OD!<'*񧀃$@5鞆BJmcSHyԂ,g|PrMhI3TֲDl;%3(GӠ *mrvr#iHN !'#'*RIMV1hFŇ"[GpRߝ-Iܺ$Mڍrly?>#^KZ~Qx5hI*\ G$xSмyVZJ#:R?;81^wSbI&zzJEA\pcmNRRƹt! Z뤭M$BFcBbkô҈Zwcg O!qf2TsT&=_$OA N$mAG(&B)e~D wkkl 69֗9h~iSeDy%Ն{5sfؿb۩/킶LFSk̐С2EOa| Y)!&CAo jMӱpDaRk(c$rqƢ!3a%LZ*GmiĶwV[NË03i7p&Wv sOzDRiU\'o(o"un׸^efYnP̪lȑHPd{*5,:!7"=Ej5^!QBȖ:mӂ_u|PN-iYJaB*4oi-y BE@PD2-"k492(4P]"($S`ǜȬ +BRBB[Qa$Ƚ *O5>(g"#G!pSM_K55a$$\s)*pύ>IϔF`.' 
I9 Z(T`cvi'_sb_!x3(G(HZ 8ZDc W=b?W]ۆ92B13Cb׏BSUο\J4.|ȅf~|Ioo5LT{. /.孛vOI{U: bB4>#TE΄@ޔ! ; ZqNy8thBkRh\B{K6ެ'Ni*Ugdnj\w9%I}$zh|~qdTEhF5( *|hꊳ| 1I.kM[ul"8[l֙s7HY=vaImո fۇ d]KmI-koƚ,;ȩJF YDm{ CX*#[r]ƪ𫜵:ʛ0ܰD3Sw!-.-9+e|C i⏺ޠ~];yׯ^'/^;̜'^@s30ì.TF/̄^52^}~1?F!{_O >gh6|4Ed2c4#ډ1ߚOˊtVzklilo47AӚ9Ʉ7iW95s]|yDA h)&veܙhNvxظ5;FRb6>!*RjƓ0^vNYe8ʜANSRVegti.1$ML>DisVu*=Vk4*Ί>;{3;T^밳4컳Vu;cQ LRfJ)OǩSI"S Ad@B #QP9`'Ǝ̪%O]|`hyIHAJ e.IBJ& B4鑻'- o Mvtq_vޡ|Q95/ l 0#Pkc Yɭ~t,)PB E d(E}RpK(1 e$}9vnfAmHN9bS%\J;H& !=D1<ոׯG#7~y 2W['bpmfJ;rl>WʞqaNrsk%Bq1wv3` 6^NI$=  UE-/OOS_LmQ/5f|u9(^}{zW}Xvxxu֙{Rh(ʮn8LiQMcϞRlʹUX_\8mﳼpvI.1y R$Km,9 t4ݜd!6^Nbkɖܼ7t>h%)J{ep"dzS(x GksM,bHS>Eƫ8hSHg@S"f (!ZtBiAΊ߷.3hS Nj5Ʌ0m0&#!ÌN%HDۂjl!|եO2-iL&YxAsgProHh@ %ԑ4IͷkqІWY#MohLI^MJklF( <3RK=mYR.N~Oի=JT"QG &|LDi:P 9ýV|hdbo^̍yo@Y"1Z(O{J7vT;"u!eu?h zJ -_\oƇU q8ZooP̝fu}kA٭ 3-UOpNS:y׸6PAe \3:n2[|KX.]֥7W9mtVfKٞT_:lsO?,.Ym7әF4ܵt:!Mr5 I!*(QD@NJx4$hR⾁XI7~px{m}> M7}; 4|[pfҬ+R6X.zzFeLAr $+9V`e,H!u{<2ٷ-Fpk%L@UW)$o24vM(x"RG QEíֈ=:d8/.}'GuJX︈4yu8q*4x$)4 XBH6LgI?{ƑlH}ȃkn`;î`UbL Iٖ S=RGD; 9.,$S{+%R.wA\lp2UWOhr6>٩:jK ^Ȍ ^`A Ɯ *ٰ0g슅"3 o /=ϓe|-]5f3W]?e??8bQ T(X)q'=%UjaEDZ*4Xk`(L) ɊTPH= JlR!`V0/2FJGt`V!"͈̈m#̌ڲGނ6q2 ):GLX`9K@0;6(r*e/!+F.qVq` !fv4H$!%|A=rҘxxX-StC}Al+"̈zDuZH% +,qd`q@1KHAL6VkQV;g\VD Ɓ35"^2XobAVJ0I%͈̈4:qɮ3qś88" VC:X aшh s"D)f;H>aX {\ \<;G-*[j؇<dxpu$e:)-c1(JN~-艨1ZX(QuS'SVnh5έRȰ, Ƹ#aREbl9)9mg@BS6{d*>+SGv:~$cztM{21)De71G+%k%u;f&Di) %/-5>՗-jq5[EoNogbvow5`\S*:I1!]SʕC.HIj1vB"#:ZF@ fcD2Y+냉41""&ZH0<)c"ҙllG4ƕEɾf׍| n0~̛v7GO Raæcw`z_{9OY9Ƶ\{`D ħ:# (ߎ) p+xfz``0H1( ōAhiQdG d3g5a( L%T8YLOP$% XZYHyaظp #m~s؝U23II'T< ;Ԯ̙7<}ޗel$hԧRg!apY *JE&=t~{鄞y~4AR#BL8-#>/c TpJn P{hH"|v/o=(~œo|֛ǿ->!zdvtf&ͧr>7R!`m"w/FKz۟8w %V=tA|vTgpO&󋗏 aL`~Ns>\~77M7l2L^JOOڞl)·avjҥW.N~[%Fv]y}xv\`3PLO W˲k.j&U0!~Y<҂)CV._t|>Y]~jԗ/hս;MUB#Q4t]ZNJqr\<: Mbg.*ouR|Tgqp~T\z;ɦd34Mugk-ݵhtʶ ^Ym"㮺-tm{>Jڳr6LCa<2?x8p+xVJfE5%f4y1gݟ'q1h\tӺ#ΉW}$ЗNBMHЧ&9/mp|rnǓ 7jtUy|>*Z 5f3vT u+Yn` .AP*F fhFnɢF3 /@8R5X,,p8=>m싪%9?T[Ҧy9Jڣ&%;nP7KfVO9!6 mJu&/&lZ}tߪxWKh7Ediv9oNozSs5cCE˭3'm`2;u[&aӤW蘠 oϡ 턷1l&ui;4[;njɻȱgޟ 
z&P\ėV Gj R cu Md6D(Zc}QUx228 gK qNbAp!lNȀptq AKDZ~փ^KIYmM\kJM9&=GFp;μqep\วL%=GR`nv> ]9Nˏ^nLʔ4uTH^ w&Ln],5ZmSZy^xx>?/<^5{ĜQNNsKrnW6HFϫǧfN0~au$hHk=Vì2"rG Ƌ0ǣE3ѳ嘃GmkԮg𕷣Vxhd$ ́}UP h"\{p,uJ  FۅLNO?y4÷xçb>~Af`\:z.[W=ů̥ƻڎNK8 i$G#_'uwI[M`~ĨoWOطpm=u954 Mbhs>u=ی+r>rǸϮ$[N;(!v32q3Ub6lvحƅ7_'y Dt*hp^pP\8LiXo"̂(ЯzáOŶriR9y̨aaiSP^L)bÒFΣMs,:{%'v"%'vVlrqgrKϕNRDg'EdP>ﳜJʑm>zM/t*զ9([A5z.H2eBiX:b{Tw@;;nYUyj jjCbIV5VJ naoLl<` |Dexis.wׇѭya_g]ܛf^_}!FNN&:8 ogut('집&&Uc1JOم/ ޅ4p϶DkP|c7T"o}Ğs=([Ggg>xɭFz$ řb 6uǥsof6 mQ[9gCs>0[?PU$>/D}xxkt|۔Hj@2I3LuxDH$DojdvDh%{xqVI"5 VZu Sb ShL5X7rDGN=~=y.-kmē:W͹7@ov#3sy1e:>9|{n\E=ёn&? /W4 %Um^v;띌6'ȁm϶rjnW1K%Z]WPGYCw;Fvw떝v}회g׮sy͌[N䃾A\*NdI'V ;Sۡ\M([W-sJZ﵌)JэӗZhaL7o[ՃS\Qu+#<`?xqs <2Y._ ƿV?+S/\q gP5|u^EO|L?m1U ų{)Ϝ.G޿(I%{aކr?&{El}\::UNVY_\뗟zz#NR4L N 9Mɇ*ϱʟ:,5[TMM$ ˡQ<^aЦG"}Yc4;9ZG+G'us0j6~Oo}%?jüyGC|z^ w ]$s/8ӬM $+vsK;;[R4~[kC&-y0]T˽qYIUaXMC!sfGW/Y̷K [N|wq^ϽRG$ 9hpKkVc#= ycKܻMƃ|Kn܀z[ܜ|M_,Ƽo1ӳ7)z]k~1˗xGBpۿ*.8sm3{'vfxS|\~ε2.%p-G?/Vj 0dG[4G\Yyޏ q@xk照uf{sccoo; ܛk:Q-18.q-6{S,[맳IA Vwڋ&Iߟ6߭Bc?ㅹcXwMQGҨf< ΐA8Dwm\5ӬdA&)xor5>(juz7%jGh<Y468p0 ֬엻v$lcsZZ˭CVZ*y!YaQ{k%h}2urhVzTu>&Qu=4lW0Ihr3 &Zk{/gBeE oDx-`t4ފi,h&x &LBg}!ۚ^3z$&=6BO>yx㲳K1 RdҰЈsS/S1  >܃XI!G|np\+cD֢rJYwb7.ORn1qs vDU4]%B9ӐJJm6=F\[8'p߬#sVPIœƚ`deS$xR}h6$Y&*FcO6Z?k^,djl 搜&68}RF= 쁘4Exn4ր`fYklwH!n/oBTKUۑ:Y0T@@QH9ŀ`ؽ:(i`t? 
9|xE6340(eWrWCXT xLz[:8 Flu%d,qAYgE0hh5U S  U<j&$$y}6"U 1T8\F{T;wͭAQNbT>=R6 PNs1O-ʡP0L@܂%ClpX \݈12^IV4`VBnU{6:i 0n`Ɛ)uc 0s;Gǵ"f#q4!v K0`LFHpPRp @L5 h6"#3y!-(+4ho\ S1D\ ̑9l 0 :d[Hl Ύj@k49**RqJ7BW~+Y%%k@By7s[k5S{ܠ˅u+k/gQᗦ;fɃ ŘQQ_Z!` $d];3r2eG[ƍWdZ|/ $-Mgj!A[ԁ6I,}tcW2JWr:TÃTHHv9!m66$/XHh_Hj IhDN !ybufd:l,%Ѽ80{ g 0}2(+=2&vdnX4+j`4® |qL#p5YK!VNb;X' mK!ן⎑'K39$b2L"+xm`ǣD|viSV `}U@@"QT$BwAK(w,e`{5c0UF咹Xsv,H@՗ln1ړCEnAz| q %Ud, kmɲEȧE vz4k,KQvn[$-JLwѱC˪[-S5(g -sgpZb8`XB"eRxd \9vц @fbR&J e/%D j@ፂH("p)2X]Uax#>zp($B k‰`#A[> 0vK 5 ՙ qDC(8 W7BRTOo$=o^cZZߠnJ l}Z e #Ð)2)ovJ<[PTgz<8ɡED]j MfɸSЌGwd5 })poG0B[+Z 8w4p4 62F8Ek$.E6;f#5eb70YrpFsrᆅh0$tw`C@* (- 5LS aoQò P!ڲD[a\a0(*bژ891pIr0:kt/Ь0F8_P0V¢1#IHJ0bZAǐVu@U4*X ,DhI2Ckpcz'/bAG=`6Q& -+{4u!A‰!peR?\s4Y)A33)vvW>(%6HnϡV8++YjXBA;XD10B EvV:V N áD1>D8{:vc4FA}EY4ӥ/hYlfp]b0 Fi D\Wu@x'Q[e!|G.ի$Bmg\\?-6@mģ +A1eOyr ܸ> I-|#U8JJ Ex6J P8;y%5DIT}J i(*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@_2E=k*d@`.А*֊W+@%ר[hT@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*^%. 9)4|@Z2#HK+@_jF9*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@_0 O J,5J cU=ZMQ=JOK d2:z遦B%PJ sj|EJ')jP J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*NU dSOqhojfF/~TMz}L)źͻtqD0*zӚxRj7~Yq4#mYh>(E߼XwۥKű{?ly,pQ ާE1Utr66{ 7GRr('tȨ?MCP娾_]:YfvBe0r9%jV06J8*\J< ղ!eӲtUsI qUl3AJ5F)B]/Mɤ(=4gџ|QMAT5 | o^{ʹڨ6g3k ׋xZ<@5&}tkjԻ,%I,A'>rF+m;`HF.2{=i//^zӎAbY/pxq>fҤq9~f9K%Z6m~e?e2#WTxeū9s4I/>ƙV}G3"BZyȵZ= YeNǤ[h`F~ݹ=w*^m94˽"UVVPaƌU 2̥cFd<.d8*"`G`X8gj`,TO+ʻ- 3>nR{x:,&uJɸ~ 2#m3)IU[^2)R` "L(̱A qV`+Z"jfc36 reᑁ{0s#MFy*=wz`ֈ`wy. Gfxx?oWH3F |;4ns`$sw|#^<y%sfCiMIpR uV lRVeX HX>T)Ab%7IWJKSTǝϔH:e𩎌FƉ,{=;)$kSz'Y eSt nZ i(ssg@BPfc!9Uuhf8_<g:[}3QLCQSVy408'AiVApaVUyL tqCo Y TUQI,* P999SlD<՞Jb(; p>zD>lopzSdܮg-PbmJoVv}72GEs,Y`9{$ 1v E#t&YeQ gzy4!bRBB_es`R0IбAiBBDy ]waR5gFgvRbxg<][YbަȤwIzRX#EtIQ‰.ao rBHo:mjgCB R4td7 `@|i/+tpg"-tȽǨkN~ p=bDpkz+bR2lpp]jmem+{ڶ5uE>8|ڸezxqכn^pMM~Kw.mߍGe>NywGCP *oެRҽ+w} Y4ybvuݤ/UW? 
RWd)ۼe}Ddo=9]C:^ݵ$H%4{>Do}aˎ#V;tF}Qwvy< }Fī%nVevTL MuFwbz=oy7]:||ӡ(+&u,=ԕ{AxXXFu.QA -;NlL#c{[yܷBmJ21̝SLsfhpJ[kg0łO",|ʄpyFL'g<> 0ωo!%ˉ'%Y7%j NĪ鐈I2A@x`fd7].\wmmK  (A0T߻ ?d}r6H^NrD"R,o"J&%jI=E3鞞﫩֮ ^W}ƫSRjq}5 뫗Q'MolsW;HmNmyӂ𦍌wHE޳VW=n$N})|{fS[VL""L(ɠ61MEbR]Bpoc( f0HY  TNItzJIhd0Z93WVΎW`xJ֜'ВI钰,Hoŋ'lDʺrwYhr;Y4IN<%fZ"W^i.sOe-Z<3`C5Sks6at5+e*Y+15IXe:aU:,'Vx$H]$*05(I옙䅠wG}%Qj&?[y̕7x5ŹAXv-$]6]hj3M9Mi曼Hr*Wpadx'r=٦u1ĢYƱ+0+~F #2P2Lp% 68FJ.fGKR4vo<ݬ %ŋf]]M@oduIƗ]p&7oWėCߌu:>K ?ތ:MVp ڪmcv:2]"Vn.m"[qZ CM e6iqt3?c|Ϗݎ.Oެ )ɯV9<Vخ9Hh*||ԏW;J׊m-%7tp[Y,1:t:j2-|2}\򞋣 UouyZv>UrOK) k>Lp;Mo&s'zT{NjhNk~x ËSG˻/߿~yDž{߼\@a_ucðRzݕI,Ƕh:wtߨ3+b c_Y$w4[=z`lzL>i{Zj[6-ڡik^bwiԼ-nʪS!0~@R08uNn ۙh/$5~<)ZJJny'SFVgG&dVE>zP"<Kv! ;ڹɳOO _K,_01" @>,A}2]J &y VI< Latv]cCGto̓8'cb+4-YZeRAӋQ jҢfD$-X By݃J iB<_&tCO)Je*\SD:h4WGn'>ve~nznLԌ&E4R39St^3b1^?jDW#Nk0HыN\b:{%#3'bOB#Eq$ {d*_x -n/}\k}Vh֘TqĄ 1@W\0Nh(XJx&̧ϠR5(Mfo=%h{ B:!C5&1RŕB ZÔ,II7OB敒PhϾ|I"sg)RS1rI@f\Cސs.<$U,`J՗Њ%))aEsZJ7rWøZ&]gMe 16Y#՟M!vrKRbRpˣ1i^c0h 2  4i٬e\9-]2|̿d*D$#\Lh<mdT$TLetU9,{iSLd%@d-0IFZdsّ5wdde9֝r(+fuLR@̐DBYh"iD"Gҟ cqT J6n__( t'm)FRyZ)BkC c.I~3d%eU `=ԪI=OPSex$ZwI82I%8D8#jI􊣊8Eh;XG;%&O2)rJƨ n$ziw('١_)·|,Kb\Ae!k*usk\UļO)u )=R{ X^b6G# ɗCɉh02v)I9 mE(eYm4ȝ N{g?t| t-fēY"A8x<: ]Wu&_o ufh/P $v)P3@E㝿Y:o qW뤨_ye{̎4,JB%Wљ>c,OCu{옄b`-><)Y8*0gMza hĵІ[.;/j{mmyS6mpEf`xQh]uhݴ7YxǓ߿ifeLj˯5of6J8=kxѝs}89QQsv-}HYh>XMӯwYzfĬiboI}!Q U!kRw.!lRҲUGfwJ4A3%s/|OիdIL׿Z|bSPg=#@LZ`,<21{=G5ꔸHb{U7IQin93`8z +8 ht)[ X18v`Ia!b%k`1 Q!5!:Rb@FIO1@I1kDyGgm_pnQܶtݾ?}3WVD\@| nT1+,)| ZT7N/p(|PEk\c. 
åh*"S`^E"*jUk{U ˃1eq7>ii4 bp|]Z 1(7d7Ƕm] #mi*2Z Ƞ/-'.32 mn:QMań֟sxz;/<,`e6f Û[7iyo<y{IKôA onpfm>&B45Uc |fQD:=0[O꣒RGnKGxrL@tRB??N:rT&?w= P?Mj,5(R*VKxzd) ZG0%`a}k;N$/Uѧ\UJQ ޽=ˮ`t%+!L`1¯ _y)7m0,xvmkgLۈ +mivkjqRϧ f@g^LO3?sBiW#6B81i:כLiҩ]fFZ^mSnaq V4*}D]x A:!2WLv7xD{'rr֩Gyj^PV?̺47E^V %f݉ZvǕ#f U(0+?@̺ 8h*ʜ(88l̺9Iy*X>0LFƅNs>PCd~7Z1'8Z5<Є/JMR5tZƎg @#L9 )($\q|W  ErkrTap9z ,H}(Ay^"]Y/Cミ9d#S(*>~W2tz2,~̑Kv2OloAUD`+.w*(Us+k|sr"8[1Ў̕~\[k VN|pqviCh]O=]놬F,2DBG K%_]& W6ZMr]*W^:IeB%#cp\'!ݓc ˺OXj wecA8s%woy7?۟?F/*S~cN vgF8d05|IHqqzda?]v0Z0tj;t@H1=$5I퀤%SGP{5ʘdzzPخ;f-E߯m#iN A]Gn[0.u25^8.W(t0)|} ead?go"zXG6O54U;,۷T桱[6}2 Tn2gzsS :-j!PUj Z07ԖM;HU )@]E=)GXSq?G^j}w GA)G!cL9 [T<צ;t݁> rnϱp^k rhzHgHM.0[Ŝ"Pc& o)32 nD۪QtX7q衄˓!I&ܧZ#Az$B.RQ ]`]%tBT)K2iW,Z!M$i 3ˌ3Lثl^I;4&wu, 'ehTn(b36\yp ;ΩtNsF4\DC3+n| ։[C:( =1++Uxe5 ڕ)<(щRF(KI^]IPyJWgRk.>JX:7YeE88MY`KXc̵7 g;>{l=݅o'$-xkb|p! ^Z`}^ HQ|zڪ[ YC-7y*UlT0-ZԢqfQ{xGgh 2 ( B<$8~Hb2Aq@u^Q($uJXE.$Y& R #D3%g1b옛:؎GɡuW*bg?tz9D4̄%ؐuYp)m6N;=OsCU9XY{ʼ(1Ae8Tu eq'sWd7]|m~}uo^&t9k<}i, n%+ rp ,A,HE_,9 pn<(w:z!E2zn:Ird㿫 g?ށ_F7y]\ڛկ[nk8EG27Y4jU./"[ʳ'~} Ɛ)o*9ˡ Y\c{&y &yJ~ceUyq[p@PGVjѳcE_x}xtG?K4kqTq*dB%ȔAW\02 )鉂%]Sd?oI޸8Jՠ4] ~(_u |:x)ZeϴUilDtt-$Z9C/9ӗ=VBxc9:s,E3 ЅF$| *@s`(s2x*Uu$6ӣz{,C*9#Җ6xBىO6 (X8}7{GMZlܢp{1HSd'} ߤWrGƭN -2OLg2TBtrC|ήѠZBOs9fMagl1tObvX2RB݌pqacDA S?\=a.T2߬nGwҒ9\r~ |dO&ESk0v1)*u I#*o/YL.ΈVS39餦{𬧢O>J~Qo1^#sa ]ou*ob ɝIW ֻ'ީĄ%L"FO*3x"hȘ,4n+̆0Dn}Nhk9Rn-Ie)BIUbM" XKxNCpv}~JQe}Ma]0 wShrDRp8ɂGƌM ×$XDۭ%%ȼ.D[-Q+.[,`bNAI$n*qro<*s߹){k ,^btĉex&{ZB-9ߘ@;Y5S{#^׾A׾cU֡r6`?{1CyMbX ,jD[}jf&Zޛ 3iF@h M@b9d)T\G9 N> nml1!EpR*/2'e =P6vLjU|VCb{R^uC):8̟ښl +w萆Qe;j`خ5[ft=[wz4\[.k:ȍCtnǾ]^/䔆A_1>-!nuwu;m{dK6koͮCjRv&wizx{J fZnڵw0F :4|K3Gi7ՅhotWB&MlWۼCrrȼuc]fn=7%nns`Ӆ?_ć0%I:sQc@Y+ `Z)%WAB>V]Ժv{[ ݽRYz cȃ9^ 2y"cH$ST1;rPWݬs\>瑹yp+b u6>`S5qP$~!aW@iV&u~l:> }CbuGG?P-k MۖK JŶGZ, CZ(󅜾ՎgT0,] KgTN{"WUAt6 -BT"X/TiN]3r]~,n6mnzJU w`SLT'6V:oADXf6}~0{M<X( 0cv(!gx.YȀJ3\jrSf>zdKYb.7&l x6JA ^0lfĨb. 
}(HF:"  i@:;I5/zyDHIZ: :kp+*]w: cgoϵkSOTvL<~Kyȫqǯw=^wx|ѾޫMjykp' ͣH£.GT"К"-`m)P<-$dzMHYyp#=ZLFe6햱Vf IƾT+.7)a7W~gE@iˣ3ܼsM#wĔ!SI[-⌕`:@hPR u+QJ'3a!,d'dBb6Ѩ`T혃,bʘp[0%y,Vtkmem{#؍6kd]3d YAJv%f\It8X\C&d(ђIȢ(\4c>D% g=Hl1bǾUEܘc?.LeX \$]q/ ]觭kHɉᐢlDɣXqh5rNGU&JЎ,UɗHEdø{CE\SJ!MQKĈfEH*[L19.&K d%NmĈ887#Oo :Wis2-+.qpiLUPD4tF(T-4v3"K&gp>pdqG<1wk<0gb:;|&/y7A<_񝠓foϦ()efXfX5SmV{O~xHR~K>TEN1Y**H‹j][¦b弟vIkJ@ ؚV}e2aID0\2F .'^uJ3zv R/yt*^ g5kC~5jj>w)8P/D1Y 1}N#[]?Fu Y)T)Y! f^J+jjH}U7sW{ JY*ұ}RiD*<%"Q2#52f- Vi sSzHƯ7qrGVVӳ}>}{jmcLowWGY2oJex&*ˑPAWۉ`"ʶoRΒs.j|0|`*(6R5A" IoI:\!r/DdQ iXLʙ[LeHC*q*itH6Jps30cmH߯ӛ=/^%Ӱwܜ~\d{x5iluɶDXe`ʭ*ji8$V$" ,;UTe1<by&gz!^W IV Boq&$WR`Ta`>+ "L`Յo/t6P 7wW=[̻,Ӏ;aճ\j8?9vu۰n5"|ڼ^, zI!qȻ-'h!n\/?4k{ZR{OO^,r !O=fIx5yoe]~CT J{< Ξ3xͻO\\ԦEW[߳=7[.d@qm'ӝ!m7aqǺ?~nx'%7*~)JWFg&wǛǒ s[-NO~8O1(mm^C޾;l`}>!7tÞᶑ/m~k 91֢ۜnSN^I [/?#Ǯ+vb ObG~OGK/l{ѝQ˰zH_> j~|:XVN49j;]cV oY!`{CawO\[Q ҇Y!?HUz/=y?yZM*- ? y[δ3;M)C=wzn9X7nޒ,qTUFZ%`Vfe9QI#S,CX1~_=}Fg4bDh-kQhcb~K!`a&MEhVۤə@okY?k`JCLwM&N~Z{B^|@t>pzv ?5mWץ^v/Uy;)[ [Oewr9tS;ˏ.-_/;̴6l9lyzR?];]wȽþaDl^gj#-籷iqz{|z߆A{tZ.};7=__O>[Ow^mmͻݺKڒrZStpO{ -Hc = {#í<߻F2㺓xrRmKa7)(b\h%u΋ЊI{)N3\wzzCdrἏЅR"dxA!,SVz)q!bȍu6i 0Q\_nJiֶ2*pѵCbN O̅$jPc*xҌJu{ŷUx0KjqAcDrULZE0)],{]Czj){"da*!h38fx-qB}uNDqBѴk! 
MPf,o GLBRPU RAdv$X1T@ߧ7`s1 4JkuZEAN/۱ h!~7z ڻ'.5Y8B"BKk2d>B#PǠ}2Euxbmb1~U%FBVkΉWe;QUO1HAY"}-ɻ)0y)(Sp6/ZJX D֛9ZVjp)BȊ-%CC|]TJޒ41[EQaaOdHMJI9G }RTaK P>冠FkE„<*uALڳ$J (Bjswɑ*&.MV^ 4@)e[ЄYEFGx`ZSΡLť ngTgNKBIrX˃jMn]9 ڢQQ TU&1aa=e uB[8iuָV(ڪ l(E H.9m[w`Plh C 21aEVΔBqaO> UBS[FL& KlP`&8<8`MhjSX\= _U@i"j(B(%6*i$D댉P0Ndg:cA Ҳ_ n{1B/E͘UpN E+Rv0'"B:}LKOmg,rrض'M,eU`|UM}$MuKK/P\r?OБJ29ff,+o\y&"ڔ#=^%D :b$ZHWsX/@ұ^O=pJnX= !JDZس-61R.:8`Ϣ~\jl1&)7s&"Ix B,&* X2(߁*QޮpD4J"x"o`@Հ0ݦzDRfn'-qWxPu%(L )R>?XHjQb!ī]J1C\5"0i0-vA)/40WU0.X6OF T@ڧY=^ VRQ 5*`!d@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P JW č䔔@$̕d@`zJ RT}J iB%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T}J UBSR0Q<{%blkTJA%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J TU@F yBJ 0WQW¢+Tq b@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P (C .wkvpӸVi=MxR&~&"0-nQZ2qOwLόSnS3fu Y9j^ߎg]0Kדq5?sy|C /[~WJY('t)b}p΋jxsHm]Mu*GWgPĠ`:_V>3xWQ5{b\441 4vpUxէܣ| ҢGUPUFz!r>^Ww}S>Gxt_(lU"W"`5[BȮ3}ܷWFYZ|͡QT=a{ }-N g,#D>ߐ 5MN.iЙ a$ο#a9lsX`=Rg:;q}lOgepf5?1??RĬSҸ=o^E}9qp ?hZNcT"֣ZK@7ᅃuGz?9|x^q_Z5T֌IPr +S ' !h^ y W~\ {'aᗗJo>?2ͮGf{~@&'ń) s9̙B}8.< TX.qdxx_Z%ZR"bQs:g*EEFZyƬE)zZ_+fSNj%/b|;m/Q(Deu-h@ij7p`qfk:W'^\^%|ٌZ]Ni?ǣᶒq^ Pa Df՝2m; A}sץfkҪ{ul+qIp/7(aTjީ]fbC-ܨuYUj,TBYWyX<dCFjڽN|Z?N#`#Gطya&l9;V@]abVR[${C6* yњEv #)SQPrG>Lo$_؜{tw[{XӸ<ů6=g)L|hM)5 ɅhFG7agݯ.7ҰK`%-wf%U%e%[5g n|gx8ȉ*a}P¨H![Lj#:B!.`j3N>8Q{3gDMhiV"g 79?;#Mq'wo)Towl|}S 6)o&.()zo2'+$߿d{lǕJ9-\{6"UVVPaƌU 2̥KFd<.deGq7sv#c\7,3B3*'¥EmŒ7Vq8~YfݝߜP] ?M?0b) H DJRADc,lՖLJ)@ R0zElnx.J"#{YMt({]:ClGL$,D)=#voFl?ǵvoq,jQ[#j#MJ.0lrQ1F<\vU9K2TlNaJ"d`TjYɕ=Dh1i"Q0 rI*JO$¨)\ݩGD݈͜2P{YN7/9mϸhy< dDC@DCNF )Qɔʺ:5Jdŧ炇yǑx;–ڡjkΧ'E&^*Wq;_^W2wg6dFQ'Aُ_&Q]hُqrNI1b>JHV2u ЙdE)~g2_FNA!і*X%RKx\0)ͤsX0` O4!!"<0*~^ ?qgrJ ՞`Lmfy"%*)"„]Y۬_=XN?I_OֻԢc?  
2 Rwd7 K`dO|/=%@547$DS 6' eK,*TF&eF&h]1nt wN"|qki /L{>PEFnNbɣ4_ĕkwl>%mϯ;]K8.>XC3%l^K~u- ߅K2m~.88uu5ܲe[[uy;Xkض=y馃xPev֝צ])|p1/ԚYJvf:)V[ ׷[npffl}&k; K'Yi+9d(|/L>9f3{cSQC{h+=߮J8z1yysbUǮ5,mӹu+J4p fd4 ׾dr]w17VO7Lwc~e`ǩ,z1l/+Ds*A?I*!m/]ɼ.Sp3y7Onn|Ń*L~ =d1[rvDd jdC.iGNd{ZGfdUKdWbN{!6~Go [Tg.\UjhGl"zht90?]W.I{sgf.H3zBcSwxuQ+mXP~k[:fMh-ٻg;ykn-lؤ>B:AY{Vwy18'7y+gָi b|>o@{Ng#x{}bîwO—nSQfb3CS:-ٻ6$W6-ՕYƳވEeCH_?Y H *MuP"DUw}yx4 Xu ̑SFg/Ϩö ߁S-\}BR4E_a>l!@du1zMS};̣f K6mf;KWkwąUOQj'M{b=ӘKzv%ccaE1q`p5i!!"ZPbd|e˜eG47[ese|]v2+ؔrE,9kXYG+UvMѐu ݌y=,~4zrsMS.-4&U G*YeD7 ǘmP'좀"db+(׽m3},k7wL]tkHמHѪ;Zޔ|A`_z?-L岌SUCF=H8Puh19~8WkHewtzUFkiS۴:=JۦN]?kq6q(u\3DU6'jLb"־6fpƊ ߾+Wk2<7>+y&l-2_&[88rVP~t/Ϳfgǟf0M27d n:-v OvJakq}C7: i iy5i]9p~zrT;.:PM10<7LBE@]Tfy>颬'2 V nلn3?=1Nj?*(E;^-%IuY$ "Pt qJ!`RBnq"]_&8CRQVYV#T/ߗ/'*ɕ-ѫW/?`4(k9^Kdm~jNX:wB^noo5v呟~8xupf3m\j3Y=:|Z$"j/luqNM+ݸ+}شټ \_J15;ۨZ͓ώ,/՚89u˓ܴsEVPxYH[D|~|:oSZt׃ ~_XXPcP.Ów"~}7_zÏ&y?u"W༤fyœiћ|YYj_pt_3+6=僼E9i!e&|/˚ o??|i^Kz˥mpXgY{u~{]եN܄NdsvcۛfvڌcgÓQ2Ԧ[RFw jY[]`HxQv%j(Dxcekc E Z%L!Ed!:1EyBM +@6BrzΨ*Ϻӯ<Γ/Cuҫ곛jJONK;ˆ^u_4=m}\(08`3͔) zfhɳ͂`m"| sE! Q@ja̱M֐A`ҢF⤝e'y;Wy =^i1o^|G#_mqݠ4PAߛ|^8=# vW㦼\߹8*!(pTؚ8V A Ex@\d @@\c9uqsۥϭY=JWiWoؘ̼3`* B|\I`dPj1 XDr6?5gMEyIÆx}kcp?bo5v7{7t7s9\~RO{Pb? kCC0\#N.)λTa94;;u*UTn|mי(q,EV c\*2ѫ`tm2})]b+CHb`]4g j6r$bV `["Q%b< ~i|JOʋv|763D޿t=$7%zb&'xlÓ+gCճ'GqV({˖zD./7@)^H/NM@h$#(bx,`0$ y3*UʈJ` Ѫi.&='[|tVk=elp0YAP\t*P{>U]!j*f蒱B]*{|=Q2? WI1R՝MLuOܕ=D[X>φ?t`J]lLFN;<#Ax NuUxV{\ 9ƔYYvhZer L/bRjL3g 9݊6zrzq?{ ͒ɔxqn9??m6C{hu}^|՚)8eJqSՈPGìR?HCHԚXC!H1DBdB-Q{o $1*vn۹_gǧuSձ'_rfbbaݢo_fyq$Wː⎟=HSE9ʕ/.34p5Pq+&ϔJv~SnA }eQw~;惪mq!Kwع<<xߗhL::%(2D+z",AdB2"$,䉋%hT%DaDa jZr@է/74F B*:Xhm`}ϭݧ0FWڌі{.[U@HK1+uU!0y5]YiVxfp!Nl+i&0"CkGQeC5-Uۤ!uUt"c(ΩlSTJ\ԪBlcxu B3׀vVAT*$nvv^U{bR6ASC Nʲ e:6~ރҌ{N$ XE>6bPX|\(ڥlʨڋj Y4fM*l]_ By񭒓=& cL\qy)0 =0')7;R✍* K,i .y@&-n ۟U +gzr/EMy0U MoJ BRI4 *)cR Z~<{t4VP-bRd7-V=8vNݶs?j6tDtȪ.tHa]L'_| Ѻ3yrChs_\@sY<6Kn%֧D2*tUH[{ZsY*#`g: 6PA.WADTM5?n.2uHP^m UkA.:82w {R BbUlg=붝Zof!wKD] d˜-V 3,X'i$f'P Vi,O*S>>_=˿"꒳x:kP2e>D6̘*Ug $j. 
juZhIOz"OjԄ"x2`!UŢ3Qj-B0D 2) GqV/N0La7@Ct"aE+ $xQI\uں*+c9;.Nкɍ>um/tVeD֙`X}XN {:&2>R].h[&+06/cюnvPg1zqػ6$WN.=\AX5r$J~ )(jdذE4{vrIQ7)& $ @"t CG 6QZR4Y`Ѯ}XE[puaJ-D!)7 Pq}LZ␼:!+,d>C%'藳B=t:h:-7}۠+4MFU+ϐK9@oap)]?s,N0t5A#}5nunL/8[4*?G4L]Cx! A)Cl]9]'Av- U^>"[P1&x&7Q>㣌 j@ZM:[Z9g /7fcQɔbgȓ:~ dm56厎v(꣹? Τ>RR/B= OG~~__ Ƨ`VDe5]7;_ŌU ΢?-,\^ v~-sE<foGm{G?sCt/Rxg?Lx+Foa]2:$4ĘZ 'Dc!Y* idUJ?dOv2YZG-`S";E-0y3P 6b,Ux.?_¡paq&oMWX 6 ':YOiv|vNp :OQq=/* 1.<ނaT^M` skV P0O`@kig$cϼx|5w̿^>ؖakϭ1s̔y͏蒖=l}yf6 S$05CMt簹WKkUو'XS|9QR.%1bS>( 1W4z͜tFg73Chc"Q'o3Hh!\Pa]W}ꍜqs[t?t{7ָ=Ivܜ.w$V_؍x~|'y1ysBq^w>!)(#%hbT@Ccx %| o>]-ؔ9ɕI`[# cC5&;`(cSEW/6A ^O1{ %rMF17)@&9"h$:&J\J;֡iI*3"ܬ_A޶9;^fW+~Wr,~?Oob sI$Ya Yk0jINf!=f`0A侟Ng_xM;\zxwM`aXZ'?*ћE*j}k?TwmWxņ[Y9:*rFcŵ2u GA&A`#)icXs&d!tHAd!tHAd!tHAd!tHAd!tHTAd!tHdmHA?/Y PAd!tHAd!tHAd!tHAd!tHAd!RJJМd?&ԝ_sx,$ǡ`U:*V0 R) QJ*_9@C4M^p^PmA`2^~x l]c5c b֦wKnc.+MXs-]ށ[܅ 5cJ[iU!ǨȎاxO[Rt%_2 )){NĈv!Ă2S5ZN,X[\FΎsg}1Oԋs|?|d6lX *encxv>)b] j6,k>_[@w},w7a?|V$rX,U$aA#~+TJШ],pYIk((4W 'EAx˳0g<`c$\z@KuZQad Yd0a~(݉ /ijv7Ԇ\?&Iz|QKzGoQ80Im(N{Q!Wh2I.9o)'蛜y3y\rKIrn(t~zx>.Cws#ݯ/Ѻ'o8Jg|^s ] ՛ yEL1ZhZEqUzڰ.e}Qm^ƽ y~O}xZS\]G*AsЫ yy1na T _oryכL's% 2ǐL&.{D^^,\o_fz@BV\@Ē46J#zy.'v5^i~j۫2zll< .qɕO:͆O9Ζ!ug*E|{5Kxn&ǩϳyH՟_|OV!lwW:2f~ƻ/LWFmgKeGd&roDCv 鞘ȶنE9@`f6\ :d9+vJ|26Ρ-:W')TΞ\ 'vxV[m'&M7LN;$/:6a7m{̚jؤMgu`bzg"5 }})uKi>ha³x6b-5IZI[US'*M%.`w1a̸Yӏkzf1+)Gگ{-3v'cq؅+cgB 33r\!w#.e/t4b?,PkӞMT7)! 
+L i{lpl/=a/PdKU=iK"k]d>x1;i\3ϤL3nx+UJR@ِy$wY~v'lͺ2P')ϵPk6x ĄZ3ٌ̜4r^Y^࿞ y~‡m@C֠HF0~z7),:i,C0\ Ƙ+506"s5TZcTPsK՝]f2}k}H#"iMo.hMĒ\T' >ipDHn#X{hhጅ9GId׀ޞŲ2ej M O$Z}wky^Nߖ0M}]v{lF4ٞBbm9nm٣={/܂V[6`5 mSsb; 4ǘ>ء,x{ݥb6:󔓰p.O{@]"E%E$<cS8Ll]ހ>+u>yϐ>e#XI%#pDL!LDO,hn0KaZ`56Ix\vޗ]#6?xN*aw}q8_=_ TCΛ\eWr) *HxGhLY#HGCv7~8PW,G&j8")qBFjMIb}&msL63 c8uma P,64b^ JvlYBQ'I΃G 6%%H ;(,#.ʾ,qꒆ蚝% 1Z[)g#*DiB:fZIjgk!Ajtd6#82,889>xOM7woٰuqpG%-_7L䷣3 JiNH6?j܄$Բk2u\S5vk#67nz 6Z{QM1Zl6.N&( (A$hjb9!(TCa=lo圩>$p$%c=H‒r&RwtvRDgײe!O(sS{!:홰{ 퓷ܖO>1Eg;=UƔIJĦQLP ]rf՞dlI%M zMC]u+/L|n-ۗa6LX٥(.%CBІR&e66GHVG{û[>WF}FQsӟuƴ닁'uE\=9=N_7-K-cb=a?wyiZK.2Ec Aq1{]2\2Z-^z{Δj Ύ4֑RSQ|)+'5s c%;wOztb·;l-4{a~\QH1~"}FVgL4G[~J_62]=|/c*5 E07MoczњEu|g>K] JzR8k7ũy&Eyr\ 9i}oF,nLJ)w/$ 6ӭUrf)󕀥lNre mz|88͇g^?_U$ǧ/49 &Z)ڙ띮o{U5R/o5qőWwv«httۋk㵽 QS[|Eۼ\#Y9m跎摎 CutU*F\HdqB/?yx|x~ɩlՙG^j>! VM|O 8}<&5%Gx^F4`q,IyQ9J ٺ2󤯝p .{)=޳gţ׼Qx|%1z˒hX,h~3^ p;O^Io4>UsGF{ (:漍PH$6QFB00,@@0clPo{Z=cJ ]{&]8$d%-giShK -uCf߅M(R 64.;öXBDn_.d|sLæswzڧܐn=iyz~[VKo\W3 T_7t"8CG>1bRҤ?Lp]<L)PMQ#O C<5LAjB(q"fp>YIзtk쑚 {Xs)zNV}G)X!ހkH0{3#׏9JK|o\lQ$(R ƈæc%Y<<i}j'Sβ⪻񖙻>w%qs|8[]1c5bzOjƅRT)P&,X'EzC_\zgS9]{oj| Z6).^VU\9s5pY$xګF+i+);{x eqzhkFo*I @`% Ɖmsjٕb8Xf LQHQ8cĺ4dMMThaӹۃoYNN۶cOֿfny[9Ebf,GT]2g G1Tz[覒,M9h9Zb1+T ovkak>x{塷}+qmKcSy>ױDsWR|JSqF(ΐGOK2g`X )'#Hy-XLa8Uv`80SM! 
M:8[t@yVBz JHjn_s.߄PړJt+K#ED )q>VmdHΗ9o\ iMgiod S`MuԪz@?sн4Xp()XAfmt(Ho%%@]8k3$.YWW"yYYz]40VRljl]1G.(E8z2+Z QkM9 O=Ƨ'׶vܛ->1 $e72зѬzx} g HPQDyrAENⱙВ@.}rE2V Y,T\Lu%%2ʏ5`P$ dltv$] bzx_2 IUbsP XlBS1ْPOV(fqߐԬԪ!X[-lJ+qeQ ^ih+(3hI3Oz"OZ3ep]?%s D`McpYW/J"fejǐWRlHnW.4GTcbMRMx DgXjև2q R6Ww3":."ݻ^ 3:vbbkɡÆ*5,z)WH)l3kIX*/.5Hf06gʚȑ_a> @#ӽ=30p(7AaRՐDK%۴*(Tߗ&!ͱp$ S)/HeB˜lL&Ŧ>Zgk7ewǓRJE Eǁ- "08 cZT< `ՁG*zHt1xTStT4.TkVXKk'-%XzYxe/m}ySe85jQRqLv!r KM!pV&p+*CG BN,fN&6үt5_1i5{|;7LPbn2,~i&|INg4D`L#6X3Rt)[ԥoEwSٵ>Fb(e].RrDF;y7MshpD0PEǬ1qTU `Q:FJvP;nӹGm M@O`xr~|_h라{S#8v7ٮ zČxSM)(;m%.ʹסE6 YiաH QeJUV#F$R/OɪHZv: c1~Vt,/ Gv7ܖ돁|!IXtO*;Qd1W3 ׿A_HCEV< _zϧ"`y79XZ|(gORIK2ϺĐIMNC*N'C #K9TʾJC㣩9>; 2ZiPfX|@C 0|9=#xSxX[ON 7+qC]b(]D~(@o5EdσYjuvrɓ0O~29sCX%Y?/Z|ys^5i@o[GНtd篻4۶[\^I\|ŞrJ%wk״8{rqz ɏ0?.fy"W 81-qa/?,k&`C#Z[ d1=z ts\R<7!$a@`ӶSo)Uq+G&'G}^/'e/dG]JhuTh+N $ QFqپ PۖhѳѴ͹XL)(2jkKD(%VSFtyU( )E;/E5jLym]:ݛo (uN⇳!|bm8 *b0Dgc"@b|$!֑1Tj [6 t oˢcО$ec&V,]!o|"sv2dS%RB'u;@/ ٗwm7'HBsͯo.NOtVbwu F8pG0h(OÇ, ]D,lUg'=@gbEPM1hDd>G $.?5^puY eN>M&糹ZÛ݈̅GgE뉰/s\o Von\V 7{eZ) z,VL|+ə'1c1;ߖE9ݛ1Ms?;4is_G*@!?:!^C.XSbA߭n@] T]ܨ;7z NJGHuǃTZ.=~iz|քt.d3? Ez!AUy.hyP@Ez(9ZmVQCB=q"ւ=HBj̓Q&@C%j]dTQ3jbxEM(M+EQ&/ NE2؛ܽ!lu*1 TOXZGL@[wsq6\IiZ4[$9aVo9{'- g} ln<d?X)ŪޓۢzOn+=f m'WY`NmG I.c0L5~NRv5++F4D@o b.B ָ j)+zg3F6{f"Đ㓣_^ȃiۧ|__ם^|F2GZ2+cǪ& 5V[$+֙@lݴ\ P~[J ϔp源1ZO{w2w ]zMyg{ ?ǾggOG[3dlA'$ 5y܈ 'd)!f `bT)t5XcZ;/ l˜Bcd7 JU+t8YݴY|uկ[-7p9ɼW%zUZ2'Qd0UU99Y)y!V5XI>|DJ%`zU܀(Yx4ەޒ-mƅ,ݾKzŢ~_C3S`)TE8BQ !ze!r9^oIsqr z|*Ճ-Ha8NdwӁNР*D4T`CAĚ"NHC'$h-8W ܂3iWT&>Utɺ*"%r]c௼ʚ&jjuPYGF#,ZElfњbX CUA0>*֪".fOv)\뤤+PL(a9VrtUA,۩,\0VJ E]8jŐ$g*0hDWnTPcgP.3VϤMT\T8*ڦ eTqڋji,U8=G7sWahsن\P*Yk7ܶ@OQe {UPY Yӫfh~'3[Vg8AkQr̢y&h$[ PUzշ\P57`+-\4Yʴu'S"œX*0tֳnӹm]Oٲ-*%.r. 
}?{իl/P>z1W|}19zΜO7r8.ֿ೛;ןp_z=?| _??gv\+`1sWYs`:`@[Mzۇ-`657: Gx vn6~p7&pgbrpgWWIח^ж(M?B3pXkW~}4z3\dA-^k]'P?O?ɸӭm[c>n+_rXH퇛b QԌ۵$M~%ٖf(KGAdZ"< K?×83QHO(^\7ve=kquH/MGhPB).R*GR a2 4@LhDQ?*FU-. jUmMGb5w5ܹ%w.tN̍C n<{ th+Kc }{)Bo/2Zb4>wkH=db$Oiղ:Cx'gDcdo\&…rm8fh2h.wV6]t}9AtY2=ݵDӃ;W}_SY.TsK\$q7[rk{m%ڔ' bqJ(EU~>/ |\?dFH`'L$PjPD䄸jWq\#;j>|i]SϹ59?>@^<;00룎|![d0w6&s5aD ?d8H3rbޠ:\cdsC4Jb \D#%cKDf,$i5&m­7~ޠO ֆtmoQO iQx@ghF~% cZt~b#D*t̃O1Iou{lx/WGPGp A- ccU*(]w>@'wvB^#-EnG{0;<LB'z'HgOGחEl<;kgl2yoM0@x䤼,4lu.^25}veO~2怟D 8Ǵ'48/S&ghR#>0s7zuǢR B0߇-$,Ă T,8 ^d 蘑_x[iVTPpjm2>X ;vyas9TS0$D$>%"0;ٱbbEv,c;Y7OI:A!-bD SL&: BnלA e{SCdP/?c+ƂQ/2dYw  ɽ5E'W`e)tyښc̉QKxi ״H _gkb&Qzkޢwe%^NRϒS^2S|BTdt."'8ǯ IAq&đEv"/ّCY)F od4Pжx&O"wtJ#QŅ׆8Pe9X0IܱMJϓ&hdV~X !Tw~RIìu(%-< uv'1cFxWx:fGVW Ǔ=R]Y,{8̏UJ#})x;FeuYx'xeuYxOe3jG0{ x!̓H21фl<*s0Guyx9,zJq)顯ٸ ׽ ޣ'B<3L(Ar݆ óxkJwIm" , & 6J qH$!58Z8$\h^ssY:{G`AKYpL$ (VĘ[YaaD=HB|S*՝*0Lt;6 w)O27}!y(&v\.uv\㒕6 ߿{ovPI! f&, `V HDbiH,IC}fTRP,PG%C2Y2Iu!(ݪT\w>Lݤc*}̙J4J(`8PHs-ΌJJeJfS:[/*)J9I[#1[#>x3J)$A0rӫA>W(? 
8a9$G诽?pbq=ѱ!]$^̀r|3&J d=/@qkMpW{?n#G/ Z?/H$_5tvqlǦNtqlV7I8V`v]V.Ә\#emD)ĻEn1K8} H%e]^f>YOhŏ1GuQݸzT7MU53D8IbDRؘ+$Jɐ3C0Y22D3\0X~q]y*_%9X,19Mg=✒8iK3*3k|oDRR&[{nE_ p<ϑ$N(F{L( "sM Cجgi"Z1 !Y„(RQV؈DI Ct,8clQBͥ?>Z}"ҵG/L{ lj2!!<%L"D:P K\51%,R1TH*SF"\PEZKbF82QM\Kg Wd-ӡ@"-B!mbnc&8U,dY$X00S1[ GHk //-4h>ib]f`tuyY[:t{9gdEn盋^w(/ ̻M h$9⭫S ߵYU* Iaʄ$Lݰ5>R$Lg0o&aV\ >0fP ǟ2Иր΁kCOT%0wea;1%ׄPv2z$R[axU觠*sOKPBO ]/ݎ).U=p?Cas򹧬 SnB{Z A>\ÍEߖNxxWFHRIDNC )e mH}"ʚU߇SޣI0y8!UNMooW5)ũIIPNG3)k]HFmW97:\yc2ݴR+N*wqmn{7*NQI`о Nx5A)> my;8 B)_]O* _0xeG W׍;_fJ'`U>{Ʒm K%OBVAY^ZTZD0mX+CҸy(bHT#0Oh$%rbv{es`7+`AD#*׸/{T%QgU㥒*9xʳ< 놦(5ЕޡKzmFB M{Z4Sϫkj{ XhլTBp?%[G*&e jWhB.ϯpϪ5^;Epy\qu'?O\5deN (>^;'ޢ_K^8Ƹ샺{tl}~>jUM[myb^s'Rzu[[*dА|*ZC*-Un'{-]q~ifVz LݞME6yИ<$d=흨>'Kߑ" dCӎhN8pqJ $Ւgj _B?#kuۺQAhΓ4!_SPDO**P\w _@@/R2JԖQ$*A{8Q@N!z(jkszw՛a['U)#D+%*j9NJE5|,uѢr@ƥrs⭐=&e9lh&8`E .0h1("͚,,t!}E ˴V,\8UI0 txVa߻?ՠOVjU^,®7ai19{ Xۡ cDUM uY]~[SRwڐJh=.O$,mBG"8Rq:lJkd&9򽾣\ᆈ_(\s!gmW瞅0NnOv AŶ(Julew PȖd4EbI49gFC..d7KU},b?TW4$Ƃk0NBH EoEWX B&kDPERkېZ &{%V&W Ҭzm^jb{Ԁc"N_ 2wl#lt |`0]\D?Nj.T,gԍ2#=!kg-fJPY~ ^p2sPyo^SX(TUgOv΢Apn2a7pDZ Gir'bWݖ )GwEZ !nb d]:' ޷_DӄhoE'wojQm oVQ6ئ^\~cz_^O}-;dC=KGK1aYdPlq r E٫mVOB1RghH0}f-.70'22AOZJIW=Җ3pz:vn.azwW[E;Ϸ۔Xn^3(+9T:.]-@~) @]ԸdhhqߔjdFQo`?n i @s_D87:iH\/e[֫uJ֫u]5fj5UϨVQ,"z UwC8Wb~wZ팰^nBBCkfflg9ʷ ހ{?*F=z[J5~~;<>Y2vE~1 lD3uy\ ZRuwp43ȿ* v"$U&(78خHɑmrZ@;XUNy|m2OFꉵǡ'Zo Ɋ9eE'EAàaZsœ?"w: {ީAP٘Gu%0J:,e$&1J KR~I0Ӝ>]:uI#3#m|bs]BPe񅟯Nƃo;bJؔހE0C8BC,%|8u o[vP$[=֘F!*?4(>VB-yr|xpl\dj;wz&?OivQ;ZsqA(r=hJK Е{o9E>9(hB EsKچvT+la0N1U d}Ab 1ۜ"՝ضo *,YB$UOPL$)am%L XPEm 94 0 ,X) cw.ET< -8Nr6v:x4S+`Ole8+hqikF9ǔ&a*\oHZ @řt4I\GWс4byGsā7К:WиOњ4Pb!4Hw͛mR zO,g<o HTj:B^In%y2FN&&/$*b EQ,Qae#lE]<.l< &At2^p2>W}L@owY%@{;ne}[{RJK(yM 67Vs*kSp_f+cL3QIc|q(=nu7u˭rkG5v`lk6~p+.vkC\qךllƊ˥0Az>ԧN߯[3o>r$ /K `ܥϖF>(0{tp??h:ZnWM!t *ѹqn57qs9SXPoqڇ/?γo9pC9" CVwe-m?>h\9h]Xz e?d,|7^C }|, B.9R{}\e`aS34qڃzuj?w{U(͹wJobo'' Bq %Kc.S CU6 y3qfMy`2GFxW}73g풴D4gޗTc7}w C"|*#=hE^/6y&+"Z/lKiSKkd_SV|9ee:.ᩳ S3?F"FW|P_`… `FQq6%E\kqb`>p ipW~Hƿa9ϧʿ/G~ixɡV>3eQKVQo(ώ_}ާ?{Ƒ_!尷G]m 
s8/d$9q!%=3e mIzu:̩E~~k)!8_/+g/ 4zDF+tEH;V1ҔP"hQDYZ#;?JmY͢< R!n U$j瑆H4B.fqIKO3H`ؗx::Oh b2^Z$\S\FG 6!2h/6>j#4pGA5FWIE5ҥ#&KM-`Q?B)oC@:}4G;&bW:],1W ehW[S/ jcŸaWʻd$Kn$ke-0ڣN8jnx2$ tRpD˸n;%eӵo4:@BfuCkZ9\\& o.ůn܄L}O7nBK,Rv;h-%m >T,Jt-!|c PVϲ=bs"'Kt_z.=HHjITFDy0jnYܐ a$Q8BsUj,$be8#E9hGpq!S=bN0Rf$EK14bCD)5 UQ#(# :)ϔVJю9-u[FEpxcΑϺ LNtMCԡ[,)")uY;4>dDEFʂeƭwhEc>Hm"j"h%( /ҥ^O,+8i$VrMs›hͬ 2̧}R!U+.!*LXeC`ZDBG i;r:D(m֘HIن"nVR1AU!O|GmuCOPZV 9#WRb\cN>oz{.t 6@_۠p!\>Ga iOHV*5)b-$%ޙ@M*c2ҡlɝgiP Qi 32b]P2ƛ<>Rc-AJh:x@v; ID[D !a]eOp# 3XW-<6ɿÕ 1}`lte'tz"T;kzV eܶPJI%zΊY Sy .W e,kXCҊ#i5b+=f][:OI y ;D'Q9UXoAs0ozuiylѪM'N4瞨4^x"b?495c/-˗8*_IO6ˏk_?_"cg鯛 Y?y QväSxzCΗA(X/~x:X]iV>FSv҅/cD#ߍC gr5 e o4.AEФe`X{$si}3ظ7N~sQї)NtF%V|;p:0*IC.G#%40Gȭwٛ< ; SBS9kJ'<499-aB*6zbtI2̘F :s:l|zr.nc:<覴1nze?55+1葘zU%l)XFh#]> HsnSݦ}aQ eÛM:IUw:kXAZ z`=9F)N/˓Z*ڼ<%MaB5|E8bDK⏢Mשdc@GRmxPK :m ?:&4}[("ۻHdmW=H*:>#ڶEVg}ȝ_[sti2G6jq1BE$AHJy|ra5T•TcX+@3 ,?6k<#<ˆݥKz6{]KLaGaPWM]l^wnWAOEks{f\Y Ţ.Nc3l (8"P'#SJHDh`:.ArR=P%d@9*c!Q] SZ۵GPFF5{q߹pCm(ѷWF4w *$ S) : E[gh6<#;iVyU AuAvCM&@ exTچqp=.Q& .?~Xp#x=i= @b@mvq\(j3%yPr^oOD86/gW'_o,,zD7bNdIpVj) C> C*lV3/QQ)kc@7.aKU]%h FnDV* UAVOsUĀKNƷg^-kv:*QhߝC:Āl Ԝ6VgѯN͹f/R5|1 @^/E]𝽜UW Nh4Z.LW@"Q_O4VsC v.Dq_c_1JM׃ˊӸ`j5%Jb ⊰è# ϭ^h_>Jzl7*rXYOܣ;1p!h0냿&62|3,vV0=N^zwDsN'3ͧ ,er }*`p.?[{Ĩ2dOoP @s+ES8A2Q{SAvAL=I_ֆ \Pb [g$Wxٜ19]=TDW Y\zRCeD\P3 }B*#1j}~y|kj1xvr :HbwC$AF3vAdh5YusZ+0Bсev<*x^ }4q~ޠ(I(]-|"ӳqi@* ÷u]\e߿b^u<ӯ!=|~Ssrg_7q4W7>j?3gͷwXǏ> %q4zӳf⩜R#;@eL)OV{2`\1q!PY O kol&g49MgyJ5Z+61 !Gjqh_!V HK*\΢TFä1or&ӸQq _΢ݶth* DzE+ŢRQi՞*DNN 0d@ in=Y_5ǡ lqʺMUjҁ[YI`JI"!upzl f%P$%ʯƥI5e&m:~wI˄(Hs "΅tK|+bGJ *U>)Qmb+ FML @pcAƃE%h#.gYW ͧ:6ɞ:|zr.ncs}8є2?yՆF'X4u54˔"oJB^ꬪ"RCK̝Dm6eYf!^{_|< }bY;K3k,!O=K/Q  4R{l<߷:=e΢<ﻔLmiry8|"Z"Sr@j}jo裞9Ss)sdt>yBəᓩbDshLqn>ϵddgڭ9SFܥኘɴ[1m E%SJw:]mvi=6GIoڞ9.,E.*Y.zPmEm%h)3AWDOCGv&a\ [ )*!_M9k_=^ыdDB_wsܱ!KsxurY;D27{,~+\U%*pK,]dW-(\I_as:.?Hwz&z5 8/72AH.w)_9o򕏼 z!d9l9aR WN"̈́KwqHxb{X`s$r16ɶ4ڙ"3%vn[I$zŪb4{,P/j9'50kԈݚJaHxUKF*&9OX:i4& Ak Fe >d<<^ȵZJsV!ӣxzB=R 9q:B mո\9M:no3(0wR!c/tg&/9ԙ׋ pZ&3ܿ3F 
'KI,gTZBDc=q[nj籑[$%{ sgX|2C񨹓FHm}HPH/8XŰJ)u:DS #!aD(.͚QqXÉa GQC= yj#D:f=)S*f T!2qh**Po鲐9hRFTT\ѧ(xB ͆ȢE4EPb}PqZ.Ɣm% 21zݷ5yV*me.I2cF vKA褎GM-XCB^)F=FELBWzoa[ᙉx/d S*Uш-S&˵=G=bW^+. 7TnhYb$x0|y- 6a6!/k ^X?fO'Ƴ\c Dz/s+ faZ]ShD1٦v#T ]m 6X9%OG[sݸț<]`&3}Nd {ڬWh"f+9`Rʔ+׺bJQm+$ )tS`0yTDkhg8H*ǵ%3@,šH H6siGR@.ā@k"X0&Hie,ZKɾ^pu='f[wkr0cwH?ۼILQ$q?-~8c5*W?8bc:FZ~oO} $̋.t YF {!2a D+Ԟ:f 9,Jbk=~Kߋ\w1XM|h;Ƙ3/bvg ٬1KJsq3ӣJZy#2HfBYX' 40q^acL1 E4Iz±v;צn<1ziE V-hAB^&Ȕl Ǽ/>=z|4r_(f6{V:cL4gln.[nǏ0Z=)F`h(@Sסz~Uh%sUU![7'MX[O%|+a5x%; 'ݿlzl$Q.A~1`fװ R\~?; Sg7V6{O>o|h'wK[ޭ߼/0~q}}q {7~VZ 4Wgk{{Ʀ`mܭg1oES Mܱrߐn~|7p"=$˞f'yckG+ fujڌDٞd𫻫 d^MzP멫G4:OlTG}o l66+-u]NY-oTj$":ܲb(il nFxu=ovᡳjޝS@ NUC pC.{?-q2%?Y8R:L֑XsҲT-N4 ߏS{#( TKu^O)옄D%֬Z+ꤙRS/W5]gJ/Ps&\ף8\EQpfx5BY`&kxZSjJk/jkCTa0%:ky-4_fn>aXxYtM)&1U%57,B[Z ۀzyda6;+A HqNI ]5nB02$ 3FžaJT61">6!eFI}10VV2(cH(E+# RI몆m`aj?.FKҨHBaPQaΑG_p*IbB1IT6IDsVx͛Gث.b\~͹rDmShj왃 XkWil@h{9k֊2` iNZa mr⩂W5]b Ű>2CWQ}\Mf f\䑱Fƒ7,2Ƹ&dl'VQ|VRC*7bmu׿-8s#p(svWf.0Csbኻ lg7p"ʟr-K}KOJ h86^sۧE>PR4-ҴL.%ʚ|}wά%ϵH`&CetKIQK1]R*>.c̘)T:m&NB{|>,γO?z3q)~>ZPm=>Nq'IcҺ;xFpߑnF^;Q{htF -Yc%}(S!2'4&FaƼڲO `9T5 8ǟ5K^RE u9Qb+{1frM+NXsd*Ȏ+ɜV Ѣ XcD2K2 eX1ǒ|K'EY !{,9ZTyuh9wv`txn Orĥ6UR"G!zŬCkʘ jpO,:l}ztOOcTqZ_L I 7f,fs>kL&d {.O?tW $b [Q=aWJ3Y.U aun'!%rfm"lLF{XaŘϵŠqNI1̝\Kzcj]+V8Y(Zj[O5k (,޷K{\!㻏ϙю؈;Ƙ5)o%eUW CN\'nY"ɸj_)ޏ2kk2:5OVj{8ԸsFG>: ~Yciaozin\0TXSu9kT uM %hcƄ`Sǹxa9c NmϸP ZC,34y{P3Cͨ5wC?uHqփL@]^<#Y;|1}r6㧌AgVc-3$Fw6ZEzk5)&<["i}?~yj lľA(O7)&4rF`@ => |9 M43G:*f{O H+gWv.CAޘҴwߚg4>ؔ}!V&`}oFVvB#.cxNt]с==2B3߃O,ّO`降%fPH5$:aniDz ( E UV>($2%e(~taWǓ&v%\R2*|d2VS$ /2g*\3N'I-,]Y喔%5UlOVHVs0lЀq?n̟5}{Ot!].߼Zڳ+,Oߜb+ +:ẽ_>l׷[! =7?Hc;@q͏]gթŮ/.F  &8F>ߺMӿʦ4蓍JJ75QAڊ_ y"$ST nɽS;zmS߾ۻ}l[Z敹l(7y@b}bp}?;1!yy<6>k @哧a&J'l%gRh=s,1&%%do߼}G\rOñukٿ;=;-co~et`kƽ Z{O†lw h)`SҍQ&bD+.JSY fx-ګ4x`_|4ٷ#XQfpscHx+ؙͼRU Q+QCX\#zCϛ#!LDHF&$<*1 \e%4yk3wX \C{(u? 
щ8~dEMV2Ҋ`MZr)7ZɱI\kC7aCRTunuU M`WdPƜ'VgTu66'`iݭ[<4PZ@҇Z2)[2&W`9=+"͞k&Qrb kv@Of+LTKnG[/.ggQ_MgQ_Mn{e]% Oay$A*=a6*@=G^iɍIQ>jV :o7EAߙmՓx0_8o_*rF-} 2[m>T׫2Db$nLPrD\h\,Ğfʶ9fv\SH2P8@EȰ1heZOL=-kILFm/Xjzl:2*0M0SJ5!82mxPGEL& :?;~y|qf߬q/RX+c%D+E{f?M ZP INp6jB^`3 &L8]Z+LZN}TIzMHP{fBT0J۳"sf eDg$K'X? zWX5*)#JkV\Qܺ턟k(* g{ 1?;`\`f~QGLFXt8no7Ohv~Ѕ'Pʝ@(w t02 1A9absIJ\Ud#밥E(H9H.y^'}ڗɊd~mY܁c${^LdnM)XziNuG+uԝRwJT+E QLjqBhcMI"XA8!0lmW.e7tWP^ûF\Sׅk5K'o) 6px WWlԆ+n YŸX=ƌ2y.L ˡ3~,'trŅxP9Conޛp,nm_^z3—K'9ܡ ,[Sry>'|;KRLV6Ff~o湒nVfA|}Lw"ON%G TCv` Ȩ >+fOf/O+=o*nj@3-Vj?6x&@9FcHVjkfkeԲ"$#la`D 6sY'BHHp}aU.L[렭߻nOƃ,݁e2Q47ZXQ%YC'C(Ja]) KTKv9)xn19 *0b_|[%9JX+e"`GxD8~`:/p"HAVA% (HrF|⧬5G(@@Q0 $EӋA~0JO]7;Џ%VnȈREZA((5%4a) *Fh-L/D_K| eldb9#&AP uHBTa҈DZ|0trV7agkpLO}ѶP(q(ǒeŅ 5\]Uz$U(ř:\-D1|#B#d6 e aBE53b-a&_f$E5Ba/5(Kqi{>sx48uVDqy1",PbMw֏(o(֓'k1q4{GbTtL7plo[^*C16pt6HQHzk|T~`X)7J̐l!8Y"oIYi9Cڥ.U X+s_jBȱh !tcaff [= &z6xs=0!6DFYA>&f.!4Hv2Laiz4;( 6 L4цQ9[P"crG!nwr3j'@+Ҫ?Cl:nP#0,[1wzm"QߕM8sDZ_a'1|:4g_-{%ɕ[w{5YZ(X5c1I.FR $-&K(2*rkE1 (3pBIa"1Z׫~獳n4 a#t 2ff:b%no5eyL|A}DF3<ԀΖ.oy:1Ԯ51|7M;Y٦yzx>gW߸l P=LD$ћ|ߧ0|줳岾ڑ0%Weꎓn4+%\,_g&YʫSj Yeƥ! L &ϵT3nM}f4ҹHh Q,D О٘[nIdp'ӞmVӺl@5G0ےuu=🖆t0sŃK$;й ba._R>;X x}0_~[>|~5kހG0u#>~ }z_^dx}}Cׯ~vo.[a~:qǵ?]oz>sPK <Б37d;~NotpB'\{0dD_'TĖHሔ&IN 9fX4V+1 E/p;K&8f%Б.N /PY%_0,O@!Yw>kk0♮-GA JX#A0(_v%F>ƏaKh)XlG)=VS3;nJ5h$iTMT ])5cPg,YP, |8s8~>hf=HagQPSs.hwcioh%4@4z5:AJ4:mU&O oP]EjCu H?E16eQtdѵ3n*I*!.w0{9-#>.: @6$~aׯڐTCW:3eCrSʸ6FQL lc 7$!{4nJc-u1?c`kr=P%mx7eJo]. 
We5aTF8 U).R"`RkF;d+Ryne<"[`д<.;f6spmø(,V/>wKgu`8Wvb[ r^$2 4YE4Ƙ߁ZHS&\(]?H =<Guf7:_(?s-?f{χ{Df _|"sBk-2_b7M_|>Wa%j/kɮv*$m`pZ`R=LӠ%JrLGI1>a]CEUX:W5dc׉(+ K\@cvF=0[34z8F#rK 2_[$R:'GQ7th &rnܽ0#WYA3ӯF7=`O&NY%Pj_:@K5]hy=w{uo pe-Ʉ1"nVrQѰsGt;C񚓏bg=u^L띞g|UW:& `0t[.^wq\|ύ%eOvFIQ8VbαS/](VbڼSV~%e-@$A`ȭ&6sN:bi-Cr҈1-1}.&$j5'T;lOn "4.N"$r!4FcI*' ;`ݒOiGYK>utԖp=jD?2@uJi`5@Ua T4l86*$F,ʭ&d8tw5JTpC"8Dgu*8D"p($ _ދ޸'{4 I>3U n !{S ͍ 䅴MÙMn5-۴4]TSm۶T *;78<]a|ְ][ljcR%-bQE$hyYxz S秧pvRrhZi,Tijo :sm@"jpdQ) 񛋤ڨ}˗f^ @e'trN_*DAth6%.ڬ!4㹮VWI&Kf\;y.kE)n\dk~͕vS|E(s*$I"IX us?Xv̓Q9I(2FD!MbqIȐ89r&J8O"\# c^xy1;JvcB%Ff7M] zWƝbOWC@FNʖs֪Y;R:PLJ@8"}(kL[ȈG_(_P}yك9RpWO5@^k?v] B*1d*Q@i(xJT )D&xI?_P*=km-qwD*SY1C948F'1(N@YJ,-DT^[\q1dW Y~;G'_ke+mX O(}bۋ}ˁ8ɗ} GK3l!s5I0k.կ>_=+VgQqM^ϭ՟_0AI]0 Hs}2  pn/L V'TĖHሔ&IbDCXcpV+ts'bɄoeSX<{^}ȹpK|;M'cvG_lg7pe4%>I@2Bfwk,[N>ַ%W^e4IGfxc߾=(W\:p-qk$S틞}a˽yz-m*X~S5(Zt8O߈Z^r7HW_7dR>笡HʥWj KIɱFoN`c\LogQq΄IcEE RҌ M'G<|xxTfąz ʷ9Jg[\-w'4|3‘FbfVo([HfQ 5Z"/)VR"0OĒa)%G$%\"JN3Q<(Sǜ;F:(ώm&k?Mni,0H"cY&XcA~6c8V&&qʑj>/)䥳y9(Mz]:,=a^#gp$0_#gp2\lo8hr#uvގ$ے{/ڇ "Dلbs".'Y%>>S(Aˁ=/wxhBFrkdlbg m>?r1("Q}"%"L .2nPӍW2J~;cNskguR[P-_\.~5տ-S]{5jKWSA~TĮrzt9P|2r͹~ۙ8$JrD$ok22pAwhx7sP} m r8rD.N%s|Y-~cY!焑i |C$Էj>p*?ҾfXYk "8í &f`|6TRHQ)H=J})eGrb$Yt|.=I8!yL!qPJAZ3L8򩎬wd:DJc¢ƎKHJ!;Ғ}B| ]* !"iÉyhaq :G#L1VTuI@* xzfN "+(򛨠ؙq_C4!|\=/>~"jM}-L]S +65 PL˵f &z:"g ؽ7f؜x\ ׭BK֭u@q$)QTElg-,qܛ;@3\et ͧ~40e9v˙_FY .1F}H~DPn'P?^#;B*p8k-<=0 B<AMZpT,b?&.ak*M}^qc !%Cx[c,x+c2htxb~<.]̲K?w`ߔXY'| ҬsMHI~MV7!c '1b\2mQQDב6&:qT&L9tTzL`Hw ڣ26 V٭nLFHuW=f6^>;X}Zˑ췡YM։kWR) Mc:]FK͊t #,Ƣ9P BlXs`pD"^"_~Ŭ͓.b.e) `MZP'%sJ$T"l¨"_YQ$u1&,[5y 8QǛ# ս,B- =;JoX/$kg_#RH F6?ᤸnN''pAЅ\ lRtF@)BPST(=ŀ$"ZfrA_Z+|&71<7&1{Ϲ},PsB1Յ沯Y<:{u 'ưr-9GR{Ze˩_ۀZ?vt*HvegWbuv֊08wRy>)Y .\LTg,MXjJD*l6ڴcЊH( #D7.$'L?!Y+*N8T0 duJ-cxCg^k)-H联o bF+Xolz'-hV\&A@>0 "֕M .R=6G!0$i7x0zl`F)CaY 6¿$9BETqD:{AhN0S^V^v5ddty%C-z xEcq=~-GfvU_l4F89k/4bgì9QDh䑹\hN1C`}ynf&9%4ްވ2zO9.9c!:g-,ApX돒/ KׇȴDem@.sͩD3̂M͹ ɗ39:a֝Ti%i&X&Ќ˽&M+U%5S`,͝4ҷH8v˒=0ۗ6xH>6xaѽ!w\ZR@|ۻt䷀Ikĸf2V̌C5TAFM ގz& 5ŚMFϕզƣݰ 1?E͍m%Is IN/9#A?k3t7Y / LJ 
Ŵi8u~Ԝ~;M/@\pY<'hAw.ee m/W'_#Zlȡ*tЅA7$&J#*KďGw*ڗՑPJIUנZxDqL-}(rߦp0 :cWG0WY=M'iYV zBB-c~>-LgX{I !rlY5{Ճ'VDMV!UC[PDlAj TH]|of;S}s/ph$Dc 1-ݨ*͑8|ezUAV\^#$8?I!Nנ0hQ(n3$=ef}NlLBcs@s 7w/XpFǤ`_Rp,$\ߎ5HqҺuZfR$w.([ %>ޝPX⅒Ε CDxm.R\|`R7luBu9 {"ԍnHϱrӨ`mNn$Q8D:`֚Ny(ձ):MeZD9eԂn-س{}d@-jmUE5FXSŧtN-J*(5H=CxbEPfN2 }ނr'`q!8Ljx~ISp({:BV\Pھ\CG1BX:QєБOe8kA1Bmۍ–[5U`·Wu!UGCzڱ6:KZ %7ت3b-EWV zU5MkbrF;v^c'hzC8!! MVۥI,T2|0he+% >!76L0"m )mxWX^\Ykbr1"VLTJ>E{_eǨ4I)>Vo+sMH;DfyB>n^|u!ZPL)rxbis. ~'ܵO?z,cq^?(k-4wjkt|gR!?L^}{sZ)n{7'~h4 ADHT;zM/ly/f.A~׿-34lu_w&K:O` mG)~7;Ji@ 8<\/c7vE^(U.$#wWxʉ#b'jh)}#]# k bK>iF/ b%U"VB2Ƭ/>eU#)0#V'jlh0ZiE}ZDOMQRRI).^6h 3?lWy+U= Zxq3M6K1B M:joDfZ,N4cǻ_oEv:7 a fګ hc1L'XLc!Ŝ@ѝ ` 6ҨQ!cȍw~A`fW|K ٓ8XD"p\:0 -}&8OJ;8ǸXDpS,4$qt|64efG/}*8`Ͻ{h**  rH(h!ʞ3GA[/8] #g x W)䧀= GֶA/=7P-sLbN@l5؝yLwHkЬ  Qew#5u[j+< ԇ:u0Dꡱ򼸼4U͛CA4TJ1\j{ZA v8)R{/k7KGI.uf[JT1n\s&9ܭ^aV bɝi?J8jJ"(fM?V"(a ,@/=?\:H߾v~

ricozx8w> y /7>ϋFblj g.wm[_nRÀƹEm4_5Ɗl- )K#V EQ͜9#0bbRI.͟Q Q?]j<&0 Vvy(}U}@718ߍYeDw+#5gV1֕1~ W󮳅t+d2ٸ8 xm=rZ:[4):8ȝ-E ah2Ihv A"yFE@sOE&lP-fه-bX4L2`%=[:g$v2. R\hG>?ГADby3~l .?{̴K~ EӎzՙCi\?cs;bNaCe8\[@*6};)9j` u^87ACs5 nqu5PҶaLk&L9sK=jk:F<;X WuޮЕեsm5fNXU@a%9*ci|jliMQ}y g5I˖&X)]e7g}ٕXѥ@.!t X ӝ U ir:`sqH!uY"QBCfU{s+wPKJ`iֹ)?ǍB[ۇ4Mt5X4hAD uJD(&k+H%TnZį T6J"椱S+o! 쵷Đ2(sOFt)vP fBP~h9ߝ|Qm1P.ivؿ-KApdLֺUOl&k @(iQR,3v8a7'6Ac(tÎ8yiad哓YH mQ p~L?izzkL]^"'xG/:_xucD%VVé8VLBE0A H)@ QD} +Nb޿h>$`\oI[xh훾s/}H:4KPKL;F2ygOnq /1AaN q85ON,]F1=(QO؜syXΟ+eZ6F̂il!&)n VveƐ$M%3S~cPRA wȈZb lk-2;9ZkD!*Xk-FI|:>皯N5xgp0#)SݿJM-()Ou=}HX7ϒE4cCC,i(0Adzi&|Xs |OST*( Eq p՘Glnʾ -`/24# BF@#xsƕRrPP2-= V!a z8 +֛Pbװakiaʱ3d06s loBA>\3_*SgX{oz%1￳?Ko4#o{SoԼr(o{4z5(pO]28ȭ(s^o8ǟL*-Zy_a?1X5VmJ߬Z%Ʌu{@{hXn8@Kn?VdCƸpd3egPbXqP;X2tsW05owOcgs5WOdᎫl,zˍ͚`ˏg}bĂ*J 0PP<<.>(Zu!CHC(J"(ZD12U8d^q2X-h53anqpnY~U D9U'Rb A} *C^7պ5Է>{͈ P2VMNi~ N1t G%Yu h])4©LϤހF;%Bb z? ‘h􆥔1l.~'+CqNݾZ,aM;P%ĶRN)giߺhpT (/7dq7R\8"4Yl o'_V'ADgAUbGIl\)Ƅଧ_Tq `Zpe4W D Cgs5ȩu2^uEpDf *%pײ9)lo,AbVK;-QF `DbCNya$CSG CDOcf24f1SPg 83;֝o'QևE웇>%*թZmGYr1TlFe/ZDok9|֧d'E9tPLY VعXʌC_ * تt)/]( )a8 %ƼlL06Y﹊)iStZ%1XܩΝ'WXO9%q_$Ylf*Lˇu5J|A$K>2w果YK0;#0[RJc_^H- {Ie:hm|=jke4wZZӗ>՘`U$j-AwN4ń93IPo+A17>;Ex3F'քUs},m52t3$3aU\;~" )N t뷕DTu }A2^{É?mskIW!9y.18;2.7ON+/Yt/d-rq, t;,di$ƭ:!*;K/T Z& i@$܆eSLDI'r/PhA.ւp}-ʦKMrF\Hw醃꣐H,Z*vUV|6@9JsKf~xg=쐢QbDNdZ m]s|k Bbjeq4N9^/X̏#]Kv4ų* "T_/l< ^cHB@ƺ|^(>Ly6J=vXzf _wϮc$ouJ#Iy~9^o8z> 9;+( vû׹yK\û#+]Ro8jR{ 0{i30V-*|&]Di"3H[7n ث{:)kSQꝵM0$yܤU!8R5E==<u_oZ£<#F81Ѝ!inV: ژ Q8#x>Ü`" kGt`yQR>z bAk ,-k!dnB=m mo 6\)zTQWѺļ `:_U1Tnm`S%S6C5v L۪AYoZn^dgc:7[M/?H-B9ޏ4 h?]9Ʌ_GS}Kz3//g%jL IFElQ sC˭*_@es?e7_5 )>S$jK;'> b4Ac4O ej$0KtnNƁG?f) R&%0 6S* 5 -SCe冄j&eWoӍ0 D/bAZR%8rEƍ '4a>W2DkM*s}Z"sQT`00)*>[nB/XSN0hP@T`7R=SDCE}z-)8=)!YX*| ͲUxLv ) TBP4 yLiJ@N~(~)5ͭb TRķ&|kc9'[,u&;Cu @׾v48@s0b1G <("2& buPjFhٗ zvk _BsPm>Oܮw)XPRjlL4ahٙTmX6|8ɭ oqN(y7EpT:Io+ف)a`4:o靚4,d~x *-ꞌ6$X'Z~gu'[{aJ=:BJxampĎLW C7k:;U`p:TPIHpG uv0QJScGʦ5nTDa83DZP{_3tq0N &(4(,D~&{& )#[q"xpHxbX  j 1uߓ 
-*kcQ`K@vYr;J\3XWS۩)MI*gDB"B:^(I^0LQVS_{NTNJ'fO&)57X&#PRQ!3 Q&$V $S!UJ=Ђh7iT/QoDS LLVPSl%2;@]NKl $-B gڐnl !fS ~&tF(AB&(!W4TrfGY",P9`Ffb3 H kBlʐZ x\3alw-'Aȵfa’6jB@DqMSVa%*  !39/YabD9 (za~ЁCDl5{8EAE^|I'$6ԕ;- |_ fy{Ͻ]{l0ybi۟c>+N9'N.gudoZ8U1̻vjN?޴ظ.mq]R]yuqX<\([m;=u^.Ʒπ\̨%[ŀA8'Ic"`dp@8 M@֜x\jd wE뮥>9;.9;zV[],'aqv:ؘInX-m"g"&/-|o.Nd~*l|{u?ެqK?}ZЫpw`XF>ݪw-ّ^@ XU#o%AIّkد*\l !Yb㟕cx-)}Gvn}ZݲMe6!)g؍J̀k侣 }-[Tvk!9-;"Cj%Q$vUm\ 1u^Σ̻z6+@#Jf8L oKb⃰RZ4ڠ<S& 6Ep>2R31`|m)5 WY"[LLqxǕSQ1yZ\K>$L #1!2#;%UsW&OyFisM^9 hd'H X0 ;nTh̹6H }GX#Xҹ~m C4 Ss8rmwacr][B9D0EٍW1d\@'um}ie5e+nm C4 S\4\ֵo.f"823AƀVB4};W0-S)A9 #*LЎdXn :7R KyzXBjRHARn)tFD ÑAP,|%5QT6?|_\h%CHѨ ^; 4n 9NsfI|G!)Ns%K;- C4 SZ7vV4@3hsL-i{ݫ[$lOsC\ws3Z*Xz@Wywﴜ^e̒rUv"/cN{# 򄕔Ne_WþL%2:/GiYIם߽?7M~;]9hʐ1Smz !޺]%?U{o_78gZ\ O- #aqXs$N!8 ǍBr,|z1ǑKT#D f+|(}& ^X.#dYR='3rˢЭ 0mlr훡~Z?Qد BqMp1,w> Hn!HJji) Y·o-pq~zO]?OC0!oŕxڴJxϷWO= QR0euCr^ݲ|kCկSCDɇ__Ml\ml~|+~~,W?Ϝm|dqB|zrz=z_J EZM.~z++my&<+UF0!sҧl`Ԟ ȔNxʭ$wlwzfe– [k pT5~ Iٽ "z'qInyhw9'ޮQYUழصN@^9\CK9^tpsƴh}.v0|\={ό@콌 frl&O+ޕfOf뽤9Ÿ񯸟 MJP__@oR·W'WG(\œv1;%Ԑ 6XQl EG7F2 dN(MCZ坞;j+׫aWրa3}ѫɦ"CٻF$W >bzeBaj ,RR"X)lh_DƑm/ }疷aWwha&0O]/ʄJZsjQ~VI^'zuUVEG)2Jh`)g%ZS.@QkaV(-O$u它-ʌbAvDzetb,Ds4/5m-K3Mz$2W9oZ}فx"*K*>X+MG| 1)z&>i9ox 3y%vyCw9%J|L]LtR/!:+KK~gD ۷# %lݭT* AWIG0 ч=w2]~g01 fBLp 1N0M NFQWk:Z#-^Ȑ [.¤- 9˱ i*{Z Ah7&,`v"!Y]e&r4֍lտNC,F5EFJC|-o0T5Zuv_y~ʥk'N|:"+;A$|VJAʜt6Zu7d%54h~U3׍(w3;DcFg@l7AvJ;R(f c(eΩf+]EpX*EWŜ:/鍳Ey`TN,5Ef׸d ё6?Gkh~nQ_էJt SOqf+^D!ވx" 7)eċ(CH^y`.*!ĩ HHsMd'ۯ⷟?:@qOR%EyFB5a&= 6B\@F:)Pt:A#$Ezz٫5n3uF$;Y4u3Z3:Hiz\!kT.0JD*cWRp -X#jB0+mdJ9 :wjۯ 2$0MZb +IM1z\^'VieR()ŸԂR!Z"`I(BdLCD`u.:Pbڮ-JJguu$SaLpɧ(2No o}Ew?uqxL^]O>K%,5>it"&r_)sulyO~jfKxD!h{DLit}ϝ@Uߕr[ަ[ >ּ="^ vBY LjV6SUs{]͛ד>,f5g.`/t&/4?{(`%D/ %+6HaEZ.^9 l!FwQ„w $Vx hB[A'#qH,8sB'!x'>}؍` \L\Ԇz4j 92H583&r9&^|})NȬKT0FY<=FpFCs*TRnJRv7jIPoÝ՛@Vzim* lTl*H@om6fgh"oʵh7:5 \I Vl?*}b"R`}LҮ~Ŷ杔O%;ޫGj|S@trG0E$ȝT!m .SI*FR6r :c(uAn m+1T*%vG18bw{`]VǓX; a+bZR, 稈c:S} ~ZB@;pa;-F8C &#)߇ Qk֣E4敎!jd(a+䉒> F.-Eȝn`mpK9m>h9% sH|: O4gòh9pFjᔩ#zL%)Ig x)K )WNYGlu=UVNZ6JlkQp$SI^󒮏 " 
m$O!y)@($NH&2Oj ,RF@탄ɝ l-wֽA юev~õnt:mڑ.â# ) cO`- PtD?0̮\L4Rq3xD*@ynTG?KOkS" 9u"Oē2]N/0lf; _{r^c|?JL}y<XW|nǁv4Ȇ=/!& @]Vx@pZ&)vZZ'h4k=>e)KeVx{&r2U4HA@@xwQ&Azj2sse[ wMA4iV˦l9[UTt kLiM~7ťU囵W83qeI Ri[ߙo 0n3_&Nq쎲bRgI$Ç?K~e*E-H Zy\(TtF-w#\\zJjpC *z.<:GIFQIG.lo̍8=fIdNxl:G& p4bo-(IIT; &",#rT qg1^l>4G:BJOh 721Wzd .4aHr nld&A CcMKmfMg8e&Z+Ewi2҂e<{R]3I,8"uCpli{{< y}B7W0o~qP2W܊al]#Wx=/ߞG4&͞nԠr:0'xx!8=Ԩ_'^+>"3Z,JLK@4w0!m|g?;##ӻĚޢIT2NiZ/nҵOŮ/rE>OWjiuf)Mc'K'h5}-$ArTv@P{Rţ7vm&,ko'(G|!sаT@`BZqUB7^>aG<.9JM3Zi%iFM3'BxAt>esGwdLuCcZ& o17s fZεh&R6㯿XL'{(7Ce{|z AZf $|a* *eɑ')ܽ:&ހ,ӛ;XFϽ .z`BFP+B\7)f+0߇oU)g5ؿB_yb4폀-^*ˇiAv L+iwLN߻Mʉ@GxtxmC-;#_qlDr:)x8gRLxYM!!L$,aZ`VrFmVz 5A`pNC,ns;n29Nb֋9ۨ`bӱ?VWlRHҁ`.GolS3~?l\({>4^/aHJ C.{K5p{M I Ƅ::,҈$H)/Ḓ>p *cBk5G\[6u>Ps,@vr$vZ "qϤ%\2iELHYBaK 6`K2%JH!tӭqɀs v{_zr%:Gș7 b3XH:b5o<ֵY7o KCvÔ:hl|jtm߽;F[)4bwBvӣ~4hQ݋G5Em ۂ xHP{Q–Lp ̅|G޴C\ ՙ.Uoye~EDɞ=&gzH_%v~Ē&B]%FP;3J2̐Υ0fթ:Sn5vp+tǹFR]6cHRkcb7WJ ,K:ZW'aqw*v4a5aݧ>JTco|SgD(2,JG59v--IpٻoxҨ-`B` ep6)_w^]0VA gu4qԎ#J$hLܴIzCT 26IQ*럪G}_!.RF%ߘۧ_!՛ױ1?lmOL\SL cr2޾ESpK ZvNcTF磴0~J%Z"..Zv.D ;QJ@TS] %9GJl߱(c*6)Nx1FK.!̏%4 uSiw ,yDkHWѫoa ](zK8;g{!X*\kkm~<|\|n#F'sSALA ]3#V| Ƴs'R QZN28반辽"&kTEL ,j ,XcqR! 
#1+6P"hp/xxi/q"zwe-G05SҦKibT5`+֍ꟕټu%.= %o-"` #򞩠Md*b{[b(q:D 4rm !+L!5/RלeN^KsdVN%rQ5wզ QD !/ ʍ.aa,p**-_ZV*9g?9.?aP>F -.U5cc|O`3Yr\?lHD?( 9- ǴR@[)Ŋ#}=ߢYف]3dD C{q8j'>eqi Z2tpck 2ǒr2Z< {hQgNI IproD`,(Fpjbh{DGt=P53':si\9HIޣBDN`) R:- / I J9%"AA[Ĝy%4T{'#yDk~">#Pʂ_n=[,hn N|M#><F㡝'X$j, VtʲP8EϮF xѵFޒGHٷOgE1H+#KQ=ZH/ߦdFYj®(qLe#I۱ƕŗ|]7|wBTF( l@HA: 吋1R4vZPBD 2ZFw/w[޽TRq0mq,ƩguXZax)G | |~X}G^0 | GbFp L0$?q"b2E&zH_W*e <BUDwk/ۅH #aSklc:)p4@R 9lTDWfFfCDEnN.q֕T(q6-r\Ĉc?@##E4iD |(FY3 5mPe(+꜔%B "H)%BA':(#oН")jo)?o7fbYb 8+^)ܽߜ%TB92p-s Tf%^Wa؜smG~Z=L&lf-lm0W>ge 8g-O^=s?-WEa(vZ~|roV'+K1fun#t+CvGrhC=hC~f m]wnk{c^Rɼ T|CT3ơt'QAAx#O06{6rD7"Dşa{~(yrXm1y䶱Uq?Yum3ĒR˭iܞ.,Άqp>֕~;7gcWX4>b1f*j7\yۖĎ=~ջײ^ &@#7a֟w7ˏ62u{uc6Le޻~0>eޑi#R1)hn-1j9e1O[: cQH6Bg[}aW_nVCY^q:t>>f7o;jãQ?{5"Iō?6=k{KIJ-zvl"珷2jؙ ,?@zU;?`L=<:$ gaB߰_/q㤦XȖx[ /"Ɔϥ%gSIЌsi]:ј"y=q9!Ƶ9_$_S]SH>x]A-B'\"#m9låh+#%hHNɓ=0ʯO^lh}V>vvT~S js戈|f͜\"9ES;s* X{X/reyKfqfh?u6@g3DmQ>wӷ/0gk=o(߾'vQ*Z!AFV(IB+0tmB( 7NSd6}C#s;w-;_t?IR=q 5Av}gYKdDb#+bydBzVhϻ_"Y=,S#q.S9]ma7"r2{n\ )bV@D: =2 N#Oq0HNσ]95y3+g=` ~/+@wPvan`H|V(JY!;Oph }U >{<| MwzA[(zfBuB].#S;TM.r|*T=1NH:V9BS x_Na1NJ׌rƘ50?-0(KaDtI· E6Q-F VQI)%B{&H[%E<(Lj/cte#q$Fk|0r(#b2ȴvB$H(-<*"5qH #qDJbDX[ޭ@B(0{;tq &JXRc.ݹp 'o,@1 tS)k&eڎfx}$Ω+g P*}Tf{zvS6ʹ ,LߔE?ޫM>wvm%}9 H`zQ+x*'ѡ-v3YORv T6 es=sdWd3zCZjt4meފGE0YEǪj+=iEha:9 $w YadTAX=I]\[ϓz9 ;G pI;:&:ǦA%Wp~p;ҊjX%NF/- בX5]1rq`0OYN.H/^:dWt =ԶևFn j$kS ^~?-ׅ! 
ԭQ8|4[ӯiqXDw*wo~~s2-m47Do@ξ\i_uBZ@ߊV^,"6yDۛYQVxltڀ;[z9?}ǜiw߿ao^ulWe1Us{ys~08lrn?oϯNFimYBIq{tyg3M5{'UygjA :B$Gߓ>Rd)>_w*rnm3 D$z$ꨒ#FieeAPI pߚj- mey+_ z[VTyTJ8,TwuIFKtO:>.P%i8I,9nS" غb 4NWT@˨z)cK^q~3>Mcxg>jiVARz=^q游 ˳/& ;z@kƏim wV@ JFe4wƸl\5vz(-FBBu^.dܲ 8zH\n>˃Ԇ"1J]b;Zɤ`vvo"7MOJi=Sq}4S3hT7ʋ`~ h7Ln,.ZufILk# X1eЦt$,` pI>S\ܢ@ZJ5dOz\2sukOzϝzsvXc3g`/\$juFIYn^77 ~D_M H@)\C%EQ)8wQT jZ9CulRlyN"Sy_Q^_VQ"=['y/74F_Q3I]]6#';VN̺{rj߱<3M~xGx6?P+˴;.Bpoο+αuSPHb0"KIwyKDuOhNI*`8a[-%S.mUi)[5֭ UJ{oMѺbP:bE`nՌZ>4+Wu ^4  ) BU2Ȃ@\{Q@.fTt+rE!"o?Ez~Z!Ӓ'<BM'ؚ} R2˜$.'eJ-J$55h!ƍbryB#YR2IPrD;:L'kr%nC.e+DUޟxq/ W?:,z|;_ϕBpAFb| A4| QHwPuPJ^hOe(XÇCC4=C} \|pWE$ʹbջ\6qRGD!p 4$)~j1i <g. d o3> xI&(BcKy9'947r"KN!^6jgzto֫1T6S9eH<})[5 @✖ے;~S91B`9 H{e>l 0T|5Vy[)k+ z*_=M7~6E' ]q0RRB%Mz]5i7NK!3^-yjInAϞ^_oOt^ElOV=-(x~xM-f9K \{\gx77СRUSj,+ mtDɚ'){n(рk#ZjM(8с5J9sp&IT9*@9H"!4s\r Ȭ &s;%@N(1>hPS^<g?s"[MwxocB͟&I?/~^dyykHJ=ˠ#Rh` \NӉa4(A+>EU־|C?hBS8]؟P7:4جj-frAh ;{\ȯmE{ǝzu_$5hgR2p29u:B55c련O`[}jVfZ]T{UH{ n=W.b7JJ`;e*#Jx&!D FJUJni:se8,dЂsw,q;1ܪus;TErm &~vbw_탭ÿ->HZH.}qVs8yHM4P[HLÍ:H!DF$nlMH=;>4`3<[غ?- {܈Y$Q5POv=v`ʹvF3; W8JjYϞ2|JesbJ N*&::V{3YRڣU)Is{ݏ.S^hxwdVH !{@*$` w\.zt&_m5=zqoV`jHFÃNQnbQ y0\pE{0KWE `( 6f" IF"C8jF ª ).]}QUj˷yQZߨo-%P?o6ƾ=,6ita4ish|oXm슪殆9hBˤ[,%$JYJ&i)@tzns.&Ky:Щeu{N|~t.j. 
jcqRFk,ߌ+l%/6.ث`[ǿ ^V7!axQD[G}\YE79Gᮄ˕V;j R 0y4M{^3@&vNA,L )BG (=:w5]n>;{ZQIFe%+~' zU@F&ɠ3#- ^ȵe8KDl*jLgNPF%Uj@ZrOlh|Ҕ2SnB %y[g,ͥ,lRP؍Zue+B9eƼJ٠ՋCr Q"phk'Z:EnE$Q a tFDA9U`k٪FJHlpx+0, Q!RMZ("9idkilµ>h#Q8\kYZޮ`9+ mZur{ѽ0\GA,z}4o]\gIF_q%9t;8@[fH0ND +=b =/.`IRQ-Cd,CA t@Sοح?*֚b7HpA  zNOáHĠ,d"Mn՘W(dR/m'm j5wUIsB9+8)b]N_Me;rbDr]=QFa|`yl_ x42~,@4ɥ!Q7hz25#] q1iޒ^b"B켠*JL3.q *)*@56@w"9=):2j̷W6$M/lbWhvۛyF\]8}X}25 =FDmDdJ7ӳ؝^}YVޜ8U!*ݮjJC+yt(~{^q/ט`x lclum5 x eWԀ ON [ZCyAj1c:rz,8 n2RQP/_wjXxnIY~ٻ6%U 4R߇^k y" >$<# [=1:hp@HLgutwu_-Ьf$6/)mIRM-+nV#ɯW$_%cL&gN6<\h\ww_FSq.Unۏ\Z{ZEnlQ[$$pS*T BDe.g9 uͷU5JkmN[ek̡ og$8Kð"ȯ}˲ BjD4~UlebS:iʿohjEn!&8@Ys< 0mqȃC0[idI<HݏI/$K{ Ck5\$5kvG_ZkCӾT "KPG]h!etLl=󺕘#/yR|J~pZH2Wp( Q0pS3$h)30\*̙e[!(08O%R'P2u_{Cg1k&iQ_k0P6ʑ1NqLļ҆Kip9!m0B/VҢb/7+cZ sAfr]ٜV-Z?lM|'Uԃ&t2> RQq๤bvt{,y5Ưj(E8䪍k~)KzJ;+ŕvWh0Y,: {,``X,Q`۟Dr)vwfzV{uvSzlDyIj:8pƾDV^hIp˻ $Tɯ{温7: OL tHҨ\_kjAb 5]VԶ7M=H1Am 5uGtI eL㑖UЬPS]`dz)|<$Vw rF|3Ҟ'4AYض((.8& -?~maX=mfq{;ϖqC$11$chY(.0/r}@V8`~Mʚ &S \NRԮxB{[jBE/jmI)JH^ľMD#P)zK!ZaZ|5)-U~ESHw!+)AL;/W緫8?ye;= (OXY˘!)r2[ƵX͒HinhO#tBoup\o,=!:][o7 ;O+pKvd2X-2if7otjOtk edO{} LkPbwO)j$ނD(h:WhMdmgY |(Bb+{?-B1Rm U+6VX4X:w ˥dAV i,F$GE(@9c`! 2&0-gYC&Nꅒʣ񶨼=KpD+V/EA!as~gqDQ.90 b8G-Bn:ޚ:S88S88*O1u>-Rg[${+(WPED *r !--wu>6Tg8>[_I['XY%&G<'J9f@:$@3TaԆ]2eek䉼]r,ke%pʟcٸHPjJY]$* }=1qn?nw ".c৻; 0v8G3kś.\HC: 0嚴Pu݉)lzz2>24ba %#@$ ̏j'k";)Ub#'4}[+Ν38X\ iUyXNHM*JU@ ZZɭ#%޾N@em+8- ܞߜh:umy2*\j-WM$ʕ#6VxFJ[TP"-mHR/]J\ӸsÓ#?^L,:Y=@a8Wp|Bj;=F.B\ \& q&d< J%A+WfJ9J]qY/"1 O+I_'Sn4@[ƓN)PB*ݦ򖻽t B>C68t'~g)F Q?J[(!HW+ڬsw#$lWQ;PD ? @!GY`)%ފvp# ݗ^l0Øѳt3.;9-z-j1Q'LU '-hy{6cUwBqk7Tz;=#KX;:ĜYn'b6*^;2Ap0ܸH8G./]< mcx`zi. %+Iى!\#Bνqi rMr)T$0`⨨%,7 y0GxD-OL:=f7DA1tmvǍ?#:~? {o'txt<^;D{w9Tѽ|b'/z{^ϡ o޾|wG/ët_ !Wtԇh_xO_g)h|vz0o{ #Ҟ\^v2y?]v~D~{#?L/wjy޾L/❱`fKw˚WWw[^ `QnpZ\J_/˟ 0ƽ/^q[3?9<8fЕG,̆f{?WM'' ~}) wMgs=,}{)D>fc9W-^yvOG]lG ۢΟo._o'Á8|~@W ϱW /{._Ӆ @+| P}ׯZNM[p77mg_~M&#w7MYp|g:9;-ܳ0ŷ_c78+)U`q)N罟)>;~JP!'Akƿpxf\X1bt+q7ul%=\~Y}eW _oD/Gt/(ߌÝN}1B: ߣǣD"#K?EqmqvZܳ308=oW^OINn 0ψ~I(QKv ;R . 
)T鳠\M$+D\TQ,fQcλ/|/1Q\v8GqᨼP0 9Nxp{%%2D\9H) aOyi c<&J&gP #u_{`s)W$AV )<9kRm8Ckn4=8R(Fc6(&8LOgP @u]VZ&*KR(S<-Zf΀FCuX0T1uLi,vR xd$( ̒* fPbB,K8#wXz]տynA=m\%BB%d)%AFiЁ7Q 'H1xahvH3>*eЦ-L7!2qj) 6jm$+43>$=hl&H:3_0UE[X]БL}^:(HQ܊ ǒ:u,tM4i6@?,LupRgh=&&=d} . h#7Ϸu[]>~]20/xG޽]> iPm{8?t""F4^~_D2E#ʾ؂R JN_:jsu!⢈ fsu9 2;hA *k.HD>*BX)emh[֦bpi{0.t h/xڃ2J7%Pf:3Xj͗Mrl k& SHвpW\s-nRÌBfAxXb A"خXIiRCL\P`W! QQ:Z"aM9-KږI#b-y1ElLƌuH$2&) n9:fdKb\%v'<\qՙPƨPe DP`Z m z< %gbFJ Ug0&5z.i0x4X' p:.Š)خ.DA`C! ՠ(ˆ+ږ p6׫ aNGQC]VYr0u[JPTm?C-_r`V'ß%e$%mdXn0RΊr}J!j}BL,id%$v8\'^[hɐƺ~g 8b?z_So'W4;I9jj+T7˷#6bZYkIX\l= HIY,;T~w0Co!kS9>z኎>s]zpP&?No=},Vip?GHlyY+xk&*9TcQ81I&_1A1F#ڠ_wo&oG_WO@~{`M+9E=s7u8xo7П?{&k=]͌:M@I2c(e^JdM9܋Wg|CtE"*P5q>OsV=Fp^-loњ;uxu#+8Ry8}W~۔r<T J!8B4Tŋ:4Lf !~iB`LœYƱXQI Hݐzs|j՗>,[ gdrgC08 t4 z=ABIqX?x 23B%I472qZ|@}DB%EMT"2iUo{!SL8A0[E@} vqvs"Vٽ[EL]`4\hTArږ) lKr{v)bdDʹBYnR CRiF[\[YmXR.%5QbcaS&qζ-pJU,)CЍ }*>[ ֯"疯I 8K^}78in-?c`/N \Ew~)!f|t Șl鵮k%f>{M$ t>C6v6|b脼O p%[4\g.gogc(E*rc剱-U*Sž6OG!ZZVS6En)p Ԟ2!mׄQٶVb$4o&`2]ś u8궣q\Fkpݎ+ 6 ;8d 2%]8ZJxTk { V)N&+&Z)NR94#"I(jӔmg˸2ܕ|vi9aHN$LRi|` )ұіƌ '‰sTqRǙB.t)^Y@%H $ql#z \Ay$5Nǖ*9EC"11%Ȅ~L لLBRTҲ %*?LE@fT(QL0UVPBJt#V%J(J+W(]iHpSg'4&y %L**m9SB@fjU(YpIF۲6w a AalU ^<dNa 5Q'jϬ6Ny ،Z$ PH*ld"Eb9u o& HP:r(0|U55|9+[UG7{[u,o/f!Q=GQw2[FxZzVUJu@lܴ5baA* &|70藇6_d&lS= n<> : (mBJqUB̴¢?zάr&`";py%F~216g4%+iUM6 `&y }lïoQ58E!= ,2_W`>Ük0ah"CeRe!gQRV-*,LlYqm.FrEȵkRnp'WAB na)$_vd$z9jyyQ1.eұu ܼk'N1Zwm}$#?87D 3|uߦ3?<;A^ Ș*;f~5z㨓u<=+4a^y{H Smxs@Tu 1{N$4:&X.z@2yX<|0p |gع{+/!eai-v{A0 cӽ8Np59(4q:}찔 ޤ\ŸEvU;Fjēhv`\wcڊnx.إJ7=z"%| 3.RƆX5N:t؀ww213-bͰYM9qhd.<2cpsZ9 Jf-ޠ$]ƕl-ML/:XAFߛL@'lUr8R&5۳6Iכ`0oE?ê̅R~.5:I*ǰcT^>~^R;3o*[EvT>x6;n^c&Xj%Ԁ\Kv:ʆiFAT{Ik3Vʓv}*%BPd׀L?;f2+~QmP7rD x?I秱9.ƤX9 bm=Xʩ$9W>&gKtv6Tě7şt\ŃQX˩y!o C{ݢ}BEۓ7{mNY+ʋRUVv\0"-Ks4v5Jeqg@ݍFvC#@rg1@K"/0)2! L30̴1F~adZ*by,%fU9kg

V\>y\ I;SܙeTT|31D풬u>+:j0"}S1TB)̕,${H[{yI=fqDapǧ*^2yk|bfZH}|^=E9_>j(nZB*CګgL˧[Gp7gGTwKEGȥ9!h CVgMU'tWV5FlLqjb/4YlW'CfJ.١l|=L4Ái !4L-11KKPOR O!&Ĥy*>Bq鵢35w/X%D25c j,6eL2-%dD+I,y4w6`QV7`C\=NjraNs.E'Ǹ 9(γ*?Ȥ96Fݔz-2+ S"UDX!D 8vQCmuL$e,׈tDgqj,pZUg5Bo_wrW{]m$hvۛ2p>o8xE=O(>ymtmt)R9zP_lg/_﻽Bܑ>5v IM"Ud)X-KUpv~ *PO%Fu"E~X=DrN3T.ZtE%] NY(c*O1ֿL&hw׵>FQqnRybM",r6MR8:5pΧ#7i44F^q4ݗ!^R]^5 m7͋/~'Dj3q v]T_F)FXt,\ mΏZ:qZ:qZ:qZ:)jivڥm׿X/@_PBSjl4p]$K")L2 *ʀqX,<^I@6XXWaz$p6&(ABKǿxKcE"43K\RIb׺IRŽzKà׼趽jOY,F5YH#J(fHꎋhikЈD 3'UN ݑ6N ADFU}c Z9 s~Uax.\ n*1>N5³|uaD`zSMDŽG¯|ui(0]|u%Kz'pf4e:%Qgw\|z(\rY&n&: #@dO*w) 3v\gB;1G NNNNZLK 1ͳ<-pρ"|ʹYƹ s@B2 І⡍WVj\H\Nt0DS #lkRcCd(\ZBp~ H87\DW9YjyF =YOY1'lH>n߻?,>uysa2v ~W`g9GO fߋz}q5*,,^WW\etPes9e2Çҧc*oR7JTAqoK6s(C"]qWbfyĈz51R1h;Z+ߙ#%d/ƌr)%K!RhOLE{o6(yv2=F'Z8]Pg \i$b^\^8י; QQ|&-zAHi]۫F[SL=e!Hr8ꌀm5)p]0wdZ1D*Hb"#IJܤ6ͣ#m'˳XE`!uҮu)hqGha8[8?_nR}F4b 岍5C2BpՌgUHf0!HɚeX45iXJډZ I΀m{51G_}> 5ß==6N =Gcepgpgpgpir%Tg*:\$JƜj hD'U(f,`ӜhƥRDPoڰІhBƈ"IEJJÏ}B%KD,X@rv=Z]~pv3u9ws?hrgwqz^e,wmS{t'|sΓkpZ iEƵqJ./)1/m|;(jķy0~ÐXQ<5b&Nohӡ҈WE4z4Bl&2èK"h\R fte2fz_Ikm,`dA" ])[&2@Dן;;꧔Z&^@c6+o`S 2֨"pWW@bJm$QnT6MrwW\y4ş\;|LFf ӑ6-)4wʁ(*u| B@ݷP3HsFqw?k2TFBt9 8c7.4nƆ);\ڙԬb@R#!T(c0PKSicoz4Xuo5u!xMBT* t25'!CY(? -XZ!$} t!@+HSd&]eSzVR.. ONoYAf(=LG/p81T1RÐ DJ~'6N,yz~yrHAD;X MH;X.J=- a%U_Zr33NBPcNo'Wfo.R|Hc!w4&8iGӾ :pьr M`m F/:v0B7B!7 !d_ zH:d )aaF*+!FiH! 
eKSq?[UEX!1LL#ETVX1s)hɹ"Ob`L 2Z47y*e< m(6]e&D<@x*5vZpZ RĹ݉sv'E[TjeIʲ ?Ϲ!B49˴1TlnDW$EQHpMXԡ )uhՕ &ޡcVc֊DrZ‹3o2I$*0 +Nϸ `.s9 c'/P$AVL>9x( IhJd$N~eiF4O4e0< I;;y:Nono!ţqjGM|^ݧv&Z.8Qp/ "^]^{9$@ZIҪ&Ú챸y)Dd&Kݧ_'sDӚ\xLpuӣii Ï뷼YL1ػ/ ,yԿ{~F+N=`(藟Hq6_iaEً㾳S;ӕ><󮃇FP2|ݛ?k͈2j2tk1z2>o7$4zgᇣ{q{o..?ǏKu7w{풌ِASV:m 2m|J Gr R>*R_<vkmP/ ~hL;}.ɼj–\?kinM`%L{gp7~ z0}xs.2Z>!Y[5}4 f^ V6/ aYW(~o/ElsQo1ߪ) {%iL=v"o׹*-[}9SMʩfdZgsN&[& l=Y׃b9Y`ԫ'ȍZIfug*չk7=[Ҽ`tSlR$Z5nľ^?^2gmОyQL?:щ37,yy:( g t^}F׏QFLUek\?ԍ1[BO+:; ~vv A oº6_ؗXo^x)Hy~uTw%8R7a &;ή6Y,yYV2˒9]HzФī7ȩBFdM&IXPh)jT&VV¨aV`QɣtVԨxt_EHM GRلiW.TR8S8rLaR.+tAax>J9ěd8UJZ;T74H7 /)HR:׋\g ՙJ!%1=D'I3aiN4R)h"h^pыY]8n>nQ׆2sN!$1 ėڜy&e (x<zIl!Q%jI޵6n$"eqv_ !qMlceeː䙝 ?դlQ7 5X]_uuUwWz[͎ `% rQˎ']GkQ]8zţyz շA?bm>`vqKɺh& n@Wy>zͬL= Ëdv6߻7C=?'`.qn= ֧dUr_='76ťDJ[o l+!XtCqQn>KB OńQB6OLي"}7KO.leu3Xy4\:Rw̌z غ2(U )]vpIP 4o7&ȨT`Eqa"lKbqdLIT,91pQe#f[61i^r*\l)Xmp 7#f7sjB =⌲ÈG\Ē3 "* '$dj1QI:MR(50)+,Rȇg9߸O ,Ï&ڇ4ul[O>jk3y ǟhz|Z]\A|Z=^]^b"'WJjuy΅F+8~"]\$ҹ,q4^(YMQ/`?f{X?u&s}񬀵ŞTklb.Xb:5cY4Vk17D`j)3(RJ l&W`Q @#'JDa)bXc2aBQȰ-Ez;wf5Zz.IÔs r#dkoo\!]K󴚏r1.`h. _!㥘}y2~]ڛ]i%/ǧb@0z`3Ur|2 . 
Mar 20 00:06:24 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 00:06:24 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 00:06:24 crc restorecon[4681]:
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc 
restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 
crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc 
restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc 
restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 00:06:24 crc 
restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:24 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 00:06:25 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 00:06:26 crc kubenswrapper[4867]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 00:06:26 crc kubenswrapper[4867]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 00:06:26 crc kubenswrapper[4867]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 00:06:26 crc kubenswrapper[4867]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 00:06:26 crc kubenswrapper[4867]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 00:06:26 crc kubenswrapper[4867]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.168174 4867 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171726 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171743 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171749 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171754 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171759 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171764 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171771 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171775 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171780 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171784 4867 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171788 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171792 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171796 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171800 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171805 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171809 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171813 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171818 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171822 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171826 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171830 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171834 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171839 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171843 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 
00:06:26.171847 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171851 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171855 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171860 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171864 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171869 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171873 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171877 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171881 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171886 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171890 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171895 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171899 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171906 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171912 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171918 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171923 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171929 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171933 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171940 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171945 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171949 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171954 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171958 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171963 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171967 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171971 4867 feature_gate.go:330] unrecognized feature gate: Example Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171977 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171983 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171988 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171992 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.171997 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172001 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172006 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172012 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172017 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172022 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172026 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172031 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172035 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172042 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172047 4867 feature_gate.go:330] unrecognized 
feature gate: InsightsOnDemandDataGather Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172051 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172055 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172060 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172064 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.172069 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172166 4867 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172176 4867 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172187 4867 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172194 4867 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172201 4867 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172206 4867 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172213 4867 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172222 4867 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172227 4867 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172233 4867 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 
00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172239 4867 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172245 4867 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172250 4867 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172255 4867 flags.go:64] FLAG: --cgroup-root="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172260 4867 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172266 4867 flags.go:64] FLAG: --client-ca-file="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172271 4867 flags.go:64] FLAG: --cloud-config="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172276 4867 flags.go:64] FLAG: --cloud-provider="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172281 4867 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172289 4867 flags.go:64] FLAG: --cluster-domain="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172294 4867 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172299 4867 flags.go:64] FLAG: --config-dir="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172304 4867 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172309 4867 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172316 4867 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172321 4867 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172326 4867 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" 
Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172332 4867 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172337 4867 flags.go:64] FLAG: --contention-profiling="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172342 4867 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172347 4867 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172352 4867 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172357 4867 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172363 4867 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172370 4867 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172375 4867 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172380 4867 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172385 4867 flags.go:64] FLAG: --enable-server="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172389 4867 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172401 4867 flags.go:64] FLAG: --event-burst="100" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172406 4867 flags.go:64] FLAG: --event-qps="50" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172411 4867 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172417 4867 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172424 4867 flags.go:64] FLAG: --eviction-hard="" Mar 20 
00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172430 4867 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172436 4867 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172441 4867 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172446 4867 flags.go:64] FLAG: --eviction-soft="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172451 4867 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172456 4867 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172462 4867 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172467 4867 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172472 4867 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172477 4867 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172482 4867 flags.go:64] FLAG: --feature-gates="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172507 4867 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172512 4867 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172518 4867 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172524 4867 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172529 4867 flags.go:64] FLAG: --healthz-port="10248" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172534 4867 flags.go:64] FLAG: --help="false" Mar 20 
00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172540 4867 flags.go:64] FLAG: --hostname-override="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172545 4867 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172551 4867 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172556 4867 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172561 4867 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172568 4867 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172572 4867 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172578 4867 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172583 4867 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172588 4867 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172593 4867 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172599 4867 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172604 4867 flags.go:64] FLAG: --kube-reserved="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172609 4867 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172614 4867 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172619 4867 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172624 4867 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172629 4867 flags.go:64] FLAG: --lock-file="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172635 4867 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172640 4867 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172646 4867 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172654 4867 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172659 4867 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172664 4867 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172669 4867 flags.go:64] FLAG: --logging-format="text" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172674 4867 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172680 4867 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172685 4867 flags.go:64] FLAG: --manifest-url="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172690 4867 flags.go:64] FLAG: --manifest-url-header="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172697 4867 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172702 4867 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172708 4867 flags.go:64] FLAG: --max-pods="110" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172714 4867 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172720 4867 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172725 4867 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172730 4867 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172736 4867 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172742 4867 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172747 4867 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172759 4867 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172764 4867 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172769 4867 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172774 4867 flags.go:64] FLAG: --pod-cidr="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172779 4867 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172787 4867 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172792 4867 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172797 4867 flags.go:64] FLAG: --pods-per-core="0" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172803 4867 flags.go:64] FLAG: --port="10250" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172808 4867 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 00:06:26 crc 
kubenswrapper[4867]: I0320 00:06:26.172813 4867 flags.go:64] FLAG: --provider-id="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172818 4867 flags.go:64] FLAG: --qos-reserved="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172823 4867 flags.go:64] FLAG: --read-only-port="10255" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172829 4867 flags.go:64] FLAG: --register-node="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172834 4867 flags.go:64] FLAG: --register-schedulable="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172844 4867 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172853 4867 flags.go:64] FLAG: --registry-burst="10" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172858 4867 flags.go:64] FLAG: --registry-qps="5" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172864 4867 flags.go:64] FLAG: --reserved-cpus="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172868 4867 flags.go:64] FLAG: --reserved-memory="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172875 4867 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172880 4867 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172885 4867 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172890 4867 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172895 4867 flags.go:64] FLAG: --runonce="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172900 4867 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172907 4867 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 
00:06:26.172912 4867 flags.go:64] FLAG: --seccomp-default="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172918 4867 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172923 4867 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172928 4867 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172933 4867 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172939 4867 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172945 4867 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172950 4867 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172955 4867 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172961 4867 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172966 4867 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172972 4867 flags.go:64] FLAG: --system-cgroups="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172977 4867 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172985 4867 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172990 4867 flags.go:64] FLAG: --tls-cert-file="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.172995 4867 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173004 4867 flags.go:64] FLAG: --tls-min-version="" Mar 20 00:06:26 crc 
kubenswrapper[4867]: I0320 00:06:26.173009 4867 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173014 4867 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173019 4867 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173025 4867 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173030 4867 flags.go:64] FLAG: --v="2" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173037 4867 flags.go:64] FLAG: --version="false" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173043 4867 flags.go:64] FLAG: --vmodule="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173050 4867 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173056 4867 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173189 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173196 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173201 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173207 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173213 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173218 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173223 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173228 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173233 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173238 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173243 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173250 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173255 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173260 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173264 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173268 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173273 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173277 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 00:06:26 crc 
kubenswrapper[4867]: W0320 00:06:26.173283 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173288 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173293 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173297 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173301 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173306 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173310 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173314 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173318 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173323 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173327 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173331 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173335 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173339 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 00:06:26 crc kubenswrapper[4867]: 
W0320 00:06:26.173343 4867 feature_gate.go:330] unrecognized feature gate: Example Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173348 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173353 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173358 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173362 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173366 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173371 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173375 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173379 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173383 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173388 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173392 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173396 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173401 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173405 4867 feature_gate.go:330] unrecognized feature gate: 
InsightsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173409 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173414 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173418 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173422 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173428 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173433 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173437 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173442 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173446 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173452 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173458 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173463 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173468 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173473 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173478 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173484 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173503 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173508 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173512 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173517 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173521 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173526 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173531 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.173537 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.173558 4867 feature_gate.go:386] 
feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.181913 4867 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.181964 4867 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182083 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182100 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182110 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182121 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182129 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182138 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182146 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182155 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182164 4867 feature_gate.go:330] unrecognized feature 
gate: InsightsConfigAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182172 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182181 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182189 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182197 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182205 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182213 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182221 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182228 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182236 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182244 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182252 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182259 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182267 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182274 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 
00:06:26.182282 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182290 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182298 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182305 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182313 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182321 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182329 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182337 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182345 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182353 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182361 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182371 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182379 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182387 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182395 4867 feature_gate.go:330] unrecognized feature gate: 
ExternalOIDC Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182402 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182411 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182418 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182426 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182433 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182441 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182449 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182457 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182464 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182472 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182480 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182487 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182531 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182541 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 
00:06:26.182551 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182559 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182567 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182575 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182583 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182591 4867 feature_gate.go:330] unrecognized feature gate: Example Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182601 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182612 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182622 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182631 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182641 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182651 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182660 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182668 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182677 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182685 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182694 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182703 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182715 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.182729 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182948 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182963 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182972 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182981 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182989 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.182996 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183004 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183012 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183020 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183030 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 
00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183038 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183045 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183053 4867 feature_gate.go:330] unrecognized feature gate: Example Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183062 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183072 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183081 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183089 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183096 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183104 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183112 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183120 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183127 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183135 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183143 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183152 4867 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183159 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183167 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183175 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183182 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183191 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183198 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183206 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183214 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183221 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183230 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183238 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183246 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183253 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183261 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 00:06:26 crc 
kubenswrapper[4867]: W0320 00:06:26.183268 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183277 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183285 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183293 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183300 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183308 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183316 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183324 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183332 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183340 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183347 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183358 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183368 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183376 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183385 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183396 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183405 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183413 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183422 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183430 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183437 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183445 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183454 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183463 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183471 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183478 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 00:06:26 crc kubenswrapper[4867]: 
W0320 00:06:26.183486 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183526 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183540 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183550 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183558 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.183568 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.183581 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.183841 4867 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.188649 4867 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.191669 4867 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 
00:06:26.191766 4867 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.193538 4867 server.go:997] "Starting client certificate rotation" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.193560 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.193769 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.220740 4867 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.221935 4867 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.225295 4867 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.243845 4867 log.go:25] "Validated CRI v1 runtime API" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.283960 4867 log.go:25] "Validated CRI v1 image API" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.285785 4867 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.290627 4867 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-00-01-05-00:/dev/sr0 7B77-95E7:/dev/vda2 
de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.290680 4867 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.315172 4867 manager.go:217] Machine: {Timestamp:2026-03-20 00:06:26.312134294 +0000 UTC m=+0.538671831 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d62f574b-dd16-438d-b253-459ad966267c BootID:4229a945-e3d9-463c-a5d7-4185d2687bef Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 
HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a9:3c:45 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a9:3c:45 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c8:45:66 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:63:0c:1b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:74:d5:8a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:48:dd:d6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:46:e0:04:e4:ce:61 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:26:8c:e6:77:c7:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 
BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.315410 4867 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.315625 4867 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.315901 4867 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.316072 4867 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.316107 4867 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.316375 4867 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.316386 4867 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.317016 4867 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.317037 4867 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.317484 4867 state_mem.go:36] "Initialized new in-memory state store" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.317572 4867 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.321126 4867 kubelet.go:418] "Attempting to sync node with API server" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.321146 4867 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.321171 4867 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.321182 4867 kubelet.go:324] "Adding apiserver pod source" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.321191 4867 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 
00:06:26.325053 4867 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.326048 4867 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.326879 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.326942 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.327082 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.327189 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.328616 4867 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 00:06:26 
crc kubenswrapper[4867]: I0320 00:06:26.330501 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330527 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330537 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330546 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330559 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330568 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330578 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330592 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330621 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330629 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330655 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.330664 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.331537 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.332107 4867 server.go:1280] "Started kubelet" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 
00:06:26.332189 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.332514 4867 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.333209 4867 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.332462 4867 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 00:06:26 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.349946 4867 server.go:460] "Adding debug handlers to kubelet server" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.350288 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.350329 4867 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.351035 4867 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.351080 4867 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.351213 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.351236 4867 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.351802 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="200ms" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.352641 4867 factory.go:55] Registering systemd factory Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.352677 4867 factory.go:221] Registration of the systemd container factory successfully Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.352715 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.352803 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.353691 4867 factory.go:153] Registering CRI-O factory Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.353716 4867 factory.go:221] Registration of the crio container factory successfully Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.351825 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e63f0273b77bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.332063676 +0000 UTC m=+0.558601203,LastTimestamp:2026-03-20 00:06:26.332063676 +0000 UTC m=+0.558601203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.353782 4867 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.353831 4867 factory.go:103] Registering Raw factory Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.353846 4867 manager.go:1196] Started watching for new ooms in manager Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.356660 4867 manager.go:319] Starting recovery of all containers Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367182 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367342 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367393 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" 
seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367421 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367445 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367475 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367540 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367567 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367600 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 
00:06:26.367637 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367667 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367693 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367717 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367747 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367785 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367811 4867 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367841 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367869 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367897 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367925 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367951 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.367998 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368034 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368060 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368152 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368195 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368300 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368356 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368382 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368410 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368437 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368462 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368521 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368548 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368576 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368611 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368639 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368665 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368691 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368719 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368745 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368784 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368811 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368841 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368878 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368911 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368960 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.368999 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369038 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369066 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369111 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369138 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369176 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369205 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369233 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369261 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369298 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369327 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 
20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369354 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369380 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369418 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369446 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369559 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369593 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.369620 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370139 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370170 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370198 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370224 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370249 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370272 4867 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370298 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370330 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370358 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370423 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370450 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370474 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370528 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370552 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370578 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370609 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370635 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370661 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370685 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370712 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370738 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370761 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370786 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370812 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370837 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370863 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370887 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370922 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370944 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370964 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.370986 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371012 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371039 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371064 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371099 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371135 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" 
seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371168 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371193 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371218 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371254 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371283 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371309 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371345 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371372 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371413 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371455 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371529 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371562 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371598 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371626 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371652 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371677 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371728 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371766 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371808 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371844 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371888 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371916 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371944 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371973 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.371998 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372023 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372047 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372078 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372104 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372127 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372153 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372177 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372217 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372243 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372267 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372290 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372315 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372340 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372369 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372391 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372418 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372441 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372477 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372549 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372575 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372602 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372627 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372663 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372690 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372716 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372742 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372769 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372793 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372820 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372845 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372871 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372909 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372934 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372961 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.372990 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373016 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373041 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373066 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373089 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373121 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373150 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373178 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373204 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373230 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373257 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373281 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373308 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373333 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373343 4867 manager.go:324] Recovery completed Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373356 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373386 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373413 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373437 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373462 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373516 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373547 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373570 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373596 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373621 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373669 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373693 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373717 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373746 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373775 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373802 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373827 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373852 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373892 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" 
seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373917 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373943 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.373967 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376233 4867 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376285 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376317 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376346 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376373 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376400 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376425 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376452 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.376480 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.377661 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.377728 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.377787 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.377815 4867 reconstruct.go:97] "Volume reconstruction finished" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.377832 4867 reconciler.go:26] "Reconciler: start to sync state" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.388063 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.389590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.389693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.389711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.390556 4867 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.390578 4867 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.390604 4867 state_mem.go:36] "Initialized new in-memory state store" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.401195 4867 policy_none.go:49] "None policy: Start" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.402356 4867 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.402553 4867 state_mem.go:35] "Initializing new in-memory state store" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.416638 4867 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.419373 4867 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.420264 4867 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.420304 4867 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.420449 4867 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.423977 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.424050 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.444793 4867 manager.go:334] "Starting Device Plugin manager" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.444849 4867 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.444864 4867 server.go:79] "Starting device plugin registration server" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.445364 4867 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.445386 4867 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.445619 4867 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.445712 4867 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.445722 4867 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.453342 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.520578 4867 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 20 00:06:26 
crc kubenswrapper[4867]: I0320 00:06:26.520676 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.521553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.521631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.521646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.521866 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.522067 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.522124 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.522935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.522967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.522976 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523076 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523180 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523517 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523558 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.523984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524113 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524372 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524403 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.524954 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525303 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc 
kubenswrapper[4867]: I0320 00:06:26.525315 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525336 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525976 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.525988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.526126 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.526155 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.526924 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.526953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.526971 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.547603 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.548478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.548609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.548694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.548802 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.549527 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.552430 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="400ms" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.580667 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.580979 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581125 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581156 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581199 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581253 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581284 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581313 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581343 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581372 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581402 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581444 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581485 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.581596 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc 
kubenswrapper[4867]: I0320 00:06:26.682791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682848 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682877 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682900 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682919 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682938 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682954 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682970 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.682987 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683004 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683022 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683039 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683057 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683113 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683129 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683118 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683160 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683172 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683160 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683253 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683270 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683321 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683325 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683354 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683315 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683356 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683380 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683516 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683529 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.683535 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.750210 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.752221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.752274 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.752288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.752320 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.752909 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection 
refused" node="crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.841562 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.847527 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.860811 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.879127 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: I0320 00:06:26.883571 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.903451 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3485108a2d58a5703918c351fee306d324c834506d7099267a303600b7a4b5bf WatchSource:0}: Error finding container 3485108a2d58a5703918c351fee306d324c834506d7099267a303600b7a4b5bf: Status 404 returned error can't find the container with id 3485108a2d58a5703918c351fee306d324c834506d7099267a303600b7a4b5bf Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.903821 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-5bb4f0128670b69377b7da1c237d59997b68654daca2751fdc165406b92bfac2 WatchSource:0}: Error finding container 5bb4f0128670b69377b7da1c237d59997b68654daca2751fdc165406b92bfac2: Status 404 returned error can't find the container with id 
5bb4f0128670b69377b7da1c237d59997b68654daca2751fdc165406b92bfac2 Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.905132 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d9c3e82947ea743062ac1e7570016d2859e04e9f80af90b9abb5c482f6672fbb WatchSource:0}: Error finding container d9c3e82947ea743062ac1e7570016d2859e04e9f80af90b9abb5c482f6672fbb: Status 404 returned error can't find the container with id d9c3e82947ea743062ac1e7570016d2859e04e9f80af90b9abb5c482f6672fbb Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.907075 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b99c679d653fd73a5e147a08b5e63f25928978cc560ec0c56b8038042dbc5b18 WatchSource:0}: Error finding container b99c679d653fd73a5e147a08b5e63f25928978cc560ec0c56b8038042dbc5b18: Status 404 returned error can't find the container with id b99c679d653fd73a5e147a08b5e63f25928978cc560ec0c56b8038042dbc5b18 Mar 20 00:06:26 crc kubenswrapper[4867]: W0320 00:06:26.917034 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4eaae9ead5e415d718015b846fe11caf0dfcb08e1adc8bdf026270a10a0045d9 WatchSource:0}: Error finding container 4eaae9ead5e415d718015b846fe11caf0dfcb08e1adc8bdf026270a10a0045d9: Status 404 returned error can't find the container with id 4eaae9ead5e415d718015b846fe11caf0dfcb08e1adc8bdf026270a10a0045d9 Mar 20 00:06:26 crc kubenswrapper[4867]: E0320 00:06:26.954261 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" 
interval="800ms" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.153341 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.154608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.154649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.154661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.154686 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 00:06:27.155234 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.332853 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:27 crc kubenswrapper[4867]: W0320 00:06:27.375303 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 00:06:27.375436 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:27 crc kubenswrapper[4867]: W0320 00:06:27.409608 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 00:06:27.409731 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.426666 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b99c679d653fd73a5e147a08b5e63f25928978cc560ec0c56b8038042dbc5b18"} Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.428050 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d9c3e82947ea743062ac1e7570016d2859e04e9f80af90b9abb5c482f6672fbb"} Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.429284 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4eaae9ead5e415d718015b846fe11caf0dfcb08e1adc8bdf026270a10a0045d9"} Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.430382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3485108a2d58a5703918c351fee306d324c834506d7099267a303600b7a4b5bf"} Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.432188 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5bb4f0128670b69377b7da1c237d59997b68654daca2751fdc165406b92bfac2"} Mar 20 00:06:27 crc kubenswrapper[4867]: W0320 00:06:27.574681 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 00:06:27.574763 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 00:06:27.755626 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="1.6s" Mar 20 00:06:27 crc kubenswrapper[4867]: W0320 00:06:27.920133 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 
00:06:27.920201 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.955811 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.958214 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.958248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.958286 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:27 crc kubenswrapper[4867]: I0320 00:06:27.958313 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:27 crc kubenswrapper[4867]: E0320 00:06:27.958678 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.276518 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 00:06:28 crc kubenswrapper[4867]: E0320 00:06:28.278098 4867 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.333664 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.437058 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551" exitCode=0 Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.437151 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.437187 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.438003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.438035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.438047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.439655 4867 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026" exitCode=0 Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.439701 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.439834 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.440838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.440875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.440891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.441028 4867 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6" exitCode=0 Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.441116 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.441228 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.442575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.442697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:28 crc 
kubenswrapper[4867]: I0320 00:06:28.443073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.444822 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.444954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.445138 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.445232 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.445002 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.447269 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.447296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.447308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.448630 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59" exitCode=0 Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.448710 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59"} Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.448724 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.451314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.451344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.451353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.454262 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.455063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.455080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:28 crc kubenswrapper[4867]: I0320 00:06:28.455088 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.198536 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.206366 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:29 crc kubenswrapper[4867]: W0320 00:06:29.325775 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:29 crc kubenswrapper[4867]: E0320 00:06:29.325855 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.333720 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:29 crc kubenswrapper[4867]: E0320 00:06:29.356605 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="3.2s" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.455224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.455264 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.455276 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.455360 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.456378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.456404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.456414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.459457 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.459478 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.459508 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.459516 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.461468 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2" exitCode=0 Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.461522 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.461602 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.462284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.462309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.462318 4867 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.464760 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.464979 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.465075 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23"} Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.465118 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.465434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.465454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.465463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.466770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.466798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.466825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:29 crc kubenswrapper[4867]: W0320 
00:06:29.480100 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 20 00:06:29 crc kubenswrapper[4867]: E0320 00:06:29.480243 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.588408 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.589635 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.589670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.589680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:29 crc kubenswrapper[4867]: I0320 00:06:29.589702 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:29 crc kubenswrapper[4867]: E0320 00:06:29.590228 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.469555 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f5730a3cf507c8afd6e631fe711a050a687bb0659f5257aed2139d84b833264"} Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.469910 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.479186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.479218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.479229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.481349 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765" exitCode=0 Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.481469 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.481521 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.481549 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.481607 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.481463 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765"} Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482206 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482684 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482818 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.482840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.483054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.483080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.483092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.483197 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.483230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:30 crc kubenswrapper[4867]: I0320 00:06:30.483249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489010 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87"} Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489086 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705"} Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489119 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6"} Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489131 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489138 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e"} Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489349 4867 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489386 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b"} Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.489995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.490320 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.490386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.490402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.749260 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.749605 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.751374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:31 crc 
kubenswrapper[4867]: I0320 00:06:31.751429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:31 crc kubenswrapper[4867]: I0320 00:06:31.751440 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.356662 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.491766 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.491770 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.492940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.492967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.492975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.493003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.493039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.493048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.663262 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.726233 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.726456 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.728325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.728368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.728377 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.791147 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.792891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.792923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.792931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:32 crc kubenswrapper[4867]: I0320 00:06:32.792953 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.310799 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 
00:06:33.311044 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.313003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.313067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.313085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.493993 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.495213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.495252 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.495265 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.612850 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.613068 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.614411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.614579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.614623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:33 crc kubenswrapper[4867]: I0320 00:06:33.867087 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:06:34 crc kubenswrapper[4867]: I0320 00:06:34.496778 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:34 crc kubenswrapper[4867]: I0320 00:06:34.497733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:34 crc kubenswrapper[4867]: I0320 00:06:34.497834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:34 crc kubenswrapper[4867]: I0320 00:06:34.497847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:34 crc kubenswrapper[4867]: I0320 00:06:34.749768 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 00:06:34 crc kubenswrapper[4867]: I0320 00:06:34.749888 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 00:06:36 crc kubenswrapper[4867]: I0320 00:06:36.141993 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 00:06:36 crc kubenswrapper[4867]: I0320 00:06:36.142244 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:36 crc kubenswrapper[4867]: I0320 00:06:36.143722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:36 crc kubenswrapper[4867]: I0320 00:06:36.143766 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:36 crc kubenswrapper[4867]: I0320 00:06:36.143777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:36 crc kubenswrapper[4867]: E0320 00:06:36.453549 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 00:06:38 crc kubenswrapper[4867]: I0320 00:06:38.357544 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 00:06:38 crc kubenswrapper[4867]: I0320 00:06:38.357730 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:38 crc kubenswrapper[4867]: I0320 00:06:38.359595 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:38 crc kubenswrapper[4867]: I0320 00:06:38.359651 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:38 crc kubenswrapper[4867]: I0320 00:06:38.359670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:40 crc kubenswrapper[4867]: W0320 00:06:40.226043 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.226137 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.231058 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.242887 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e63f0273b77bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.332063676 +0000 UTC m=+0.558601203,LastTimestamp:2026-03-20 00:06:26.332063676 +0000 UTC m=+0.558601203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:06:40 crc kubenswrapper[4867]: W0320 00:06:40.248164 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.248270 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:40 crc kubenswrapper[4867]: W0320 00:06:40.248572 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.248685 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:40 crc kubenswrapper[4867]: W0320 00:06:40.254720 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.254862 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.255375 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.255551 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.255668 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.256731 4867 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.257516 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39636->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.257592 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39636->192.168.126.11:17697: read: connection reset by peer"
Mar 20 00:06:40 crc kubenswrapper[4867]: E0320 00:06:40.258098 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.260445 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.260540 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.335446 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:40Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.513004 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.515284 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f5730a3cf507c8afd6e631fe711a050a687bb0659f5257aed2139d84b833264" exitCode=255
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.515328 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4f5730a3cf507c8afd6e631fe711a050a687bb0659f5257aed2139d84b833264"}
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.515478 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.516277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.516343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.516363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:40 crc kubenswrapper[4867]: I0320 00:06:40.517244 4867 scope.go:117] "RemoveContainer" containerID="4f5730a3cf507c8afd6e631fe711a050a687bb0659f5257aed2139d84b833264"
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.336254 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:41Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.522049 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.524475 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"}
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.524705 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.525568 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.525626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:41 crc kubenswrapper[4867]: I0320 00:06:41.525645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.338666 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:42Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.529717 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.531045 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.534365 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b" exitCode=255
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.534483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"}
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.534635 4867 scope.go:117] "RemoveContainer" containerID="4f5730a3cf507c8afd6e631fe711a050a687bb0659f5257aed2139d84b833264"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.534833 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.536211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.536268 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.536291 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.537281 4867 scope.go:117] "RemoveContainer" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"
Mar 20 00:06:42 crc kubenswrapper[4867]: E0320 00:06:42.537673 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 00:06:42 crc kubenswrapper[4867]: I0320 00:06:42.673077 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.338126 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:43Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.539692 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.542642 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.544259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.544304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.544323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.545091 4867 scope.go:117] "RemoveContainer" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"
Mar 20 00:06:43 crc kubenswrapper[4867]: E0320 00:06:43.545359 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.549276 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.653663 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.654310 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.656115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.656171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.656196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:43 crc kubenswrapper[4867]: I0320 00:06:43.673672 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 00:06:44 crc kubenswrapper[4867]: W0320 00:06:44.015037 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:44Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:44 crc kubenswrapper[4867]: E0320 00:06:44.015099 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.336452 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:44Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.545837 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.545930 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.547173 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.547231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.547251 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.547229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.547332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.547346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.548247 4867 scope.go:117] "RemoveContainer" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"
Mar 20 00:06:44 crc kubenswrapper[4867]: E0320 00:06:44.548590 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.761291 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 00:06:44 crc kubenswrapper[4867]: I0320 00:06:44.761389 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 00:06:44 crc kubenswrapper[4867]: W0320 00:06:44.917269 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:44Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:44 crc kubenswrapper[4867]: E0320 00:06:44.917377 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 00:06:45 crc kubenswrapper[4867]: I0320 00:06:45.338040 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:45Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:46 crc kubenswrapper[4867]: I0320 00:06:46.337152 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:46Z is after 2026-02-23T05:33:13Z
Mar 20 00:06:46 crc kubenswrapper[4867]: E0320 00:06:46.454048 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 00:06:46 crc kubenswrapper[4867]: I0320 00:06:46.655517 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:46 crc kubenswrapper[4867]: I0320 00:06:46.657010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:46 crc kubenswrapper[4867]: I0320 00:06:46.657065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:46 crc kubenswrapper[4867]: I0320 00:06:46.657083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:46 crc kubenswrapper[4867]: I0320 00:06:46.657117 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 00:06:46 crc kubenswrapper[4867]: E0320 00:06:46.661924 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:46Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 00:06:46 crc kubenswrapper[4867]: E0320 00:06:46.665916 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:06:46Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 00:06:47 crc kubenswrapper[4867]: I0320 00:06:47.339777 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.341010 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.710062 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.738076 4867 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.790482 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.790752 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.792362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.792417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.792436 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:48 crc kubenswrapper[4867]: I0320 00:06:48.793319 4867 scope.go:117] "RemoveContainer" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"
Mar 20 00:06:48 crc kubenswrapper[4867]: E0320 00:06:48.793678 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.337911 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.451937 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.557997 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.559484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.559593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.559612 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:06:49 crc kubenswrapper[4867]: I0320 00:06:49.560647 4867 scope.go:117] "RemoveContainer" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b"
Mar 20 00:06:49 crc kubenswrapper[4867]: E0320 00:06:49.561021 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.250913 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f0273b77bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.332063676 +0000 UTC m=+0.558601203,LastTimestamp:2026-03-20 00:06:26.332063676 +0000 UTC m=+0.558601203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.258129 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.263712 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.268801 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.276292 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02ecfdd8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.459229578 +0000 UTC m=+0.685767105,LastTimestamp:2026-03-20 00:06:26.459229578 +0000 UTC m=+0.685767105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.282273 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.521612867 +0000
UTC m=+0.748150404,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.288612 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.521641357 +0000 UTC m=+0.748178894,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.294288 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aab3916\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.521654157 +0000 UTC m=+0.748191694,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.300597 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.522954198 +0000 UTC m=+0.749491715,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.307890 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.522972288 +0000 UTC m=+0.749509805,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.313714 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aab3916\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.522980828 +0000 UTC m=+0.749518345,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.318966 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.523195017 +0000 UTC m=+0.749732544,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.324035 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.523210527 +0000 UTC m=+0.749748054,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.330707 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aab3916\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.523223016 +0000 UTC m=+0.749760543,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.336057 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.523957401 +0000 UTC m=+0.750494938,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: I0320 00:06:50.336336 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.338362 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.523979941 +0000 UTC m=+0.750517468,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.342378 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aab3916\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.523990471 +0000 UTC m=+0.750527998,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.346123 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.524841205 +0000 UTC m=+0.751378722,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.351049 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.524867835 +0000 UTC m=+0.751405352,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.356479 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aab3916\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.524874765 +0000 UTC m=+0.751412282,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.361931 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.525039444 +0000 UTC m=+0.751576971,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.364095 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.525056424 +0000 UTC m=+0.751593961,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.369042 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aab3916\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aab3916 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389719318 +0000 UTC 
m=+0.616256855,LastTimestamp:2026-03-20 00:06:26.525068474 +0000 UTC m=+0.751606001,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.375933 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaa8107\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaa8107 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389672199 +0000 UTC m=+0.616209736,LastTimestamp:2026-03-20 00:06:26.525287172 +0000 UTC m=+0.751824699,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.381013 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e63f02aaafe1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e63f02aaafe1a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.389704218 +0000 UTC m=+0.616241745,LastTimestamp:2026-03-20 00:06:26.525299522 +0000 UTC m=+0.751837049,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.387402 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f049ecbe8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.914107021 +0000 UTC m=+1.140644578,LastTimestamp:2026-03-20 00:06:26.914107021 +0000 UTC m=+1.140644578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.392378 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f049eced32 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.914118962 +0000 UTC m=+1.140656499,LastTimestamp:2026-03-20 00:06:26.914118962 +0000 UTC m=+1.140656499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.397651 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f049ed8a95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.914159253 +0000 UTC m=+1.140696760,LastTimestamp:2026-03-20 00:06:26.914159253 +0000 UTC m=+1.140696760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.403298 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f049edaec2 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.914168514 +0000 UTC m=+1.140706071,LastTimestamp:2026-03-20 00:06:26.914168514 +0000 UTC m=+1.140706071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.409961 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e63f04aa84a9c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:26.926398108 +0000 UTC m=+1.152935625,LastTimestamp:2026-03-20 00:06:26.926398108 +0000 UTC m=+1.152935625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc 
kubenswrapper[4867]: E0320 00:06:50.415482 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f06d6fc23b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.509895739 +0000 UTC m=+1.736433256,LastTimestamp:2026-03-20 00:06:27.509895739 +0000 UTC m=+1.736433256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.421443 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f06d70f9ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.509975532 +0000 UTC m=+1.736513049,LastTimestamp:2026-03-20 00:06:27.509975532 +0000 UTC m=+1.736513049,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.428917 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f06d71cc53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.510029395 +0000 UTC m=+1.736566912,LastTimestamp:2026-03-20 00:06:27.510029395 +0000 UTC m=+1.736566912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.435625 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e63f06db2f966 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.514300774 +0000 UTC m=+1.740838291,LastTimestamp:2026-03-20 00:06:27.514300774 +0000 UTC 
m=+1.740838291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.442075 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f06e247da6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.521740198 +0000 UTC m=+1.748277715,LastTimestamp:2026-03-20 00:06:27.521740198 +0000 UTC m=+1.748277715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.447808 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f06e35eaa5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.522882213 +0000 UTC m=+1.749419730,LastTimestamp:2026-03-20 00:06:27.522882213 +0000 UTC m=+1.749419730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.453756 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f06e570bb9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.525053369 +0000 UTC m=+1.751590886,LastTimestamp:2026-03-20 00:06:27.525053369 +0000 UTC m=+1.751590886,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.459270 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f06e6a0d95 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.526299029 +0000 UTC m=+1.752836566,LastTimestamp:2026-03-20 00:06:27.526299029 +0000 UTC m=+1.752836566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.467638 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f06e6e1dd1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.526565329 +0000 UTC m=+1.753102846,LastTimestamp:2026-03-20 00:06:27.526565329 +0000 UTC m=+1.753102846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.472218 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e63f06e86a139 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.528171833 +0000 UTC m=+1.754709350,LastTimestamp:2026-03-20 00:06:27.528171833 +0000 UTC m=+1.754709350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.476819 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f06f5f25c9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.542361545 +0000 UTC m=+1.768899062,LastTimestamp:2026-03-20 00:06:27.542361545 +0000 UTC m=+1.768899062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.483628 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f081951493 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.847885971 +0000 UTC m=+2.074423528,LastTimestamp:2026-03-20 00:06:27.847885971 +0000 UTC m=+2.074423528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.490016 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f08215b826 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.856316454 +0000 UTC m=+2.082853971,LastTimestamp:2026-03-20 00:06:27.856316454 +0000 UTC m=+2.082853971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.497354 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f082298e52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.857616466 +0000 UTC m=+2.084153983,LastTimestamp:2026-03-20 00:06:27.857616466 +0000 UTC m=+2.084153983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.503000 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f08cb1b9cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.034312651 +0000 UTC m=+2.260850168,LastTimestamp:2026-03-20 00:06:28.034312651 +0000 UTC m=+2.260850168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.510395 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f08d54ca1b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.044999195 +0000 UTC m=+2.271536712,LastTimestamp:2026-03-20 00:06:28.044999195 +0000 UTC m=+2.271536712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.515924 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f08d68aaf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.046301936 +0000 UTC m=+2.272839453,LastTimestamp:2026-03-20 00:06:28.046301936 +0000 UTC m=+2.272839453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.520895 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f098dd0352 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.238476114 +0000 UTC m=+2.465013641,LastTimestamp:2026-03-20 00:06:28.238476114 +0000 UTC m=+2.465013641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.525737 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f099a9a773 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.251887475 +0000 UTC m=+2.478425002,LastTimestamp:2026-03-20 00:06:28.251887475 +0000 UTC m=+2.478425002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.532571 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f0a4d3f27c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.439208572 +0000 UTC m=+2.665746119,LastTimestamp:2026-03-20 00:06:28.439208572 +0000 UTC m=+2.665746119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.538720 4867 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0a5b791b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.454126002 +0000 UTC m=+2.680663519,LastTimestamp:2026-03-20 00:06:28.454126002 +0000 UTC m=+2.680663519,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.545090 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e63f0a5b7a738 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.454131512 +0000 UTC 
m=+2.680669029,LastTimestamp:2026-03-20 00:06:28.454131512 +0000 UTC m=+2.680669029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.550876 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0a5b88679 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.454188665 +0000 UTC m=+2.680726182,LastTimestamp:2026-03-20 00:06:28.454188665 +0000 UTC m=+2.680726182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.557394 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f0b611780b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.728453131 +0000 UTC m=+2.954990648,LastTimestamp:2026-03-20 00:06:28.728453131 +0000 UTC m=+2.954990648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.563709 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0b614e3c5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.728677317 +0000 UTC m=+2.955214834,LastTimestamp:2026-03-20 00:06:28.728677317 +0000 UTC m=+2.955214834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.569080 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e63f0b619b0b5 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.728991925 +0000 UTC m=+2.955529442,LastTimestamp:2026-03-20 00:06:28.728991925 +0000 UTC m=+2.955529442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.574284 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0b6247f6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.729700204 +0000 UTC m=+2.956237721,LastTimestamp:2026-03-20 00:06:28.729700204 +0000 UTC m=+2.956237721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.579650 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e63f0b72b2c38 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.746914872 +0000 UTC m=+2.973452389,LastTimestamp:2026-03-20 00:06:28.746914872 +0000 UTC m=+2.973452389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.584264 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0b72d1a2d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.747041325 +0000 UTC m=+2.973578832,LastTimestamp:2026-03-20 00:06:28.747041325 +0000 UTC m=+2.973578832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.594035 4867 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0b72f53cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.747187149 +0000 UTC m=+2.973724666,LastTimestamp:2026-03-20 00:06:28.747187149 +0000 UTC m=+2.973724666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.601994 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0b774f46a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.75175025 +0000 UTC m=+2.978287767,LastTimestamp:2026-03-20 00:06:28.75175025 +0000 UTC m=+2.978287767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.609948 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0b77aee1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.75214185 +0000 UTC m=+2.978679367,LastTimestamp:2026-03-20 00:06:28.75214185 +0000 UTC m=+2.978679367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.617301 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f0b7d85cd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.758265043 
+0000 UTC m=+2.984802560,LastTimestamp:2026-03-20 00:06:28.758265043 +0000 UTC m=+2.984802560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.623949 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0c3eade65 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.960804453 +0000 UTC m=+3.187341970,LastTimestamp:2026-03-20 00:06:28.960804453 +0000 UTC m=+3.187341970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.630290 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0c3fcdbd5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.961983445 +0000 UTC m=+3.188520962,LastTimestamp:2026-03-20 00:06:28.961983445 +0000 UTC m=+3.188520962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.634870 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0c4b139b7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.973803959 +0000 UTC m=+3.200341476,LastTimestamp:2026-03-20 00:06:28.973803959 +0000 UTC m=+3.200341476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.642010 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0c4c4f9d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.975098323 +0000 UTC m=+3.201635840,LastTimestamp:2026-03-20 00:06:28.975098323 +0000 UTC m=+3.201635840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.649579 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0c4d1cfc5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.975939525 +0000 UTC m=+3.202477042,LastTimestamp:2026-03-20 00:06:28.975939525 +0000 UTC m=+3.202477042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.657389 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0c4ddffef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:28.976738287 +0000 UTC m=+3.203275804,LastTimestamp:2026-03-20 00:06:28.976738287 +0000 UTC m=+3.203275804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.665219 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0d08fae34 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.172932148 +0000 UTC m=+3.399469665,LastTimestamp:2026-03-20 00:06:29.172932148 +0000 UTC 
m=+3.399469665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.672827 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0d0997a79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.173574265 +0000 UTC m=+3.400111782,LastTimestamp:2026-03-20 00:06:29.173574265 +0000 UTC m=+3.400111782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.676603 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0d14806c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.185013449 +0000 UTC m=+3.411550966,LastTimestamp:2026-03-20 00:06:29.185013449 +0000 UTC m=+3.411550966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.682476 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0d157622e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.186019886 +0000 UTC m=+3.412557403,LastTimestamp:2026-03-20 00:06:29.186019886 +0000 UTC m=+3.412557403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.688931 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e63f0d167b898 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.187090584 +0000 UTC m=+3.413628101,LastTimestamp:2026-03-20 00:06:29.187090584 +0000 UTC m=+3.413628101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.695253 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0dc980ed1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.374807761 +0000 UTC m=+3.601345278,LastTimestamp:2026-03-20 00:06:29.374807761 +0000 UTC m=+3.601345278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.702982 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0dd540336 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.387125558 +0000 UTC m=+3.613663075,LastTimestamp:2026-03-20 00:06:29.387125558 +0000 UTC m=+3.613663075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.707479 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0dd67e26b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.388427883 +0000 UTC m=+3.614965400,LastTimestamp:2026-03-20 00:06:29.388427883 +0000 UTC m=+3.614965400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 
00:06:50.713891 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f0e1e1a9b2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.463517618 +0000 UTC m=+3.690055125,LastTimestamp:2026-03-20 00:06:29.463517618 +0000 UTC m=+3.690055125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.719060 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0ed18c43a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.651678266 +0000 UTC m=+3.878215783,LastTimestamp:2026-03-20 00:06:29.651678266 +0000 UTC 
m=+3.878215783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.724728 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0edb8e4ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.662172415 +0000 UTC m=+3.888709932,LastTimestamp:2026-03-20 00:06:29.662172415 +0000 UTC m=+3.888709932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.730901 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f0edf5cddf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.666164191 +0000 UTC m=+3.892701708,LastTimestamp:2026-03-20 
00:06:29.666164191 +0000 UTC m=+3.892701708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.735397 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f0eeb94611 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.678974481 +0000 UTC m=+3.905512008,LastTimestamp:2026-03-20 00:06:29.678974481 +0000 UTC m=+3.905512008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.741387 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f11ec3cd81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
00:06:30.484970881 +0000 UTC m=+4.711508388,LastTimestamp:2026-03-20 00:06:30.484970881 +0000 UTC m=+4.711508388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.746133 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f12a1804b2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:30.67503941 +0000 UTC m=+4.901576937,LastTimestamp:2026-03-20 00:06:30.67503941 +0000 UTC m=+4.901576937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.750647 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f12a956392 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:30.683255698 +0000 UTC m=+4.909793215,LastTimestamp:2026-03-20 00:06:30.683255698 +0000 UTC 
m=+4.909793215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.757080 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f12aa2b21b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:30.684127771 +0000 UTC m=+4.910665288,LastTimestamp:2026-03-20 00:06:30.684127771 +0000 UTC m=+4.910665288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.763612 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f134ce4e88 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:30.854758024 +0000 UTC 
m=+5.081295541,LastTimestamp:2026-03-20 00:06:30.854758024 +0000 UTC m=+5.081295541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.768859 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f135751178 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:30.865686904 +0000 UTC m=+5.092224421,LastTimestamp:2026-03-20 00:06:30.865686904 +0000 UTC m=+5.092224421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.774115 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f1358073ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:30.866432974 +0000 UTC m=+5.092970491,LastTimestamp:2026-03-20 00:06:30.866432974 +0000 UTC m=+5.092970491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.778753 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f13fa0b141 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.036318017 +0000 UTC m=+5.262855534,LastTimestamp:2026-03-20 00:06:31.036318017 +0000 UTC m=+5.262855534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.783837 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f140518d7f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.047908735 +0000 UTC m=+5.274446252,LastTimestamp:2026-03-20 00:06:31.047908735 +0000 UTC m=+5.274446252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.789697 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f140673d2a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.049329962 +0000 UTC m=+5.275867479,LastTimestamp:2026-03-20 00:06:31.049329962 +0000 UTC m=+5.275867479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.795308 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f14da40d10 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.271419152 +0000 UTC m=+5.497956699,LastTimestamp:2026-03-20 00:06:31.271419152 +0000 UTC m=+5.497956699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.800885 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f14e85888c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.286196364 +0000 UTC m=+5.512733891,LastTimestamp:2026-03-20 00:06:31.286196364 +0000 UTC m=+5.512733891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.806474 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f14e9ea7d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.287842768 +0000 UTC m=+5.514380285,LastTimestamp:2026-03-20 00:06:31.287842768 +0000 UTC m=+5.514380285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.811239 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f1579c3a93 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.438678675 +0000 UTC m=+5.665216202,LastTimestamp:2026-03-20 00:06:31.438678675 +0000 UTC m=+5.665216202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.815898 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e63f15832e1bb openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:31.448551867 +0000 UTC m=+5.675089384,LastTimestamp:2026-03-20 00:06:31.448551867 +0000 UTC m=+5.675089384,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.823672 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 00:06:50 crc kubenswrapper[4867]: &Event{ObjectMeta:{kube-controller-manager-crc.189e63f21cf8c53a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 00:06:50 crc kubenswrapper[4867]: body: Mar 20 00:06:50 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:34.749855034 +0000 UTC m=+8.976392591,LastTimestamp:2026-03-20 00:06:34.749855034 +0000 UTC m=+8.976392591,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:06:50 crc kubenswrapper[4867]: > Mar 20 00:06:50 crc 
kubenswrapper[4867]: E0320 00:06:50.828884 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f21cf9ec6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:34.749930606 +0000 UTC m=+8.976468163,LastTimestamp:2026-03-20 00:06:34.749930606 +0000 UTC m=+8.976468163,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.834150 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 00:06:50 crc kubenswrapper[4867]: &Event{ObjectMeta:{kube-apiserver-crc.189e63f36523e90b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 00:06:50 crc kubenswrapper[4867]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 00:06:50 crc kubenswrapper[4867]: Mar 20 00:06:50 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:40.255609099 +0000 UTC m=+14.482146656,LastTimestamp:2026-03-20 00:06:40.255609099 +0000 UTC m=+14.482146656,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:06:50 crc kubenswrapper[4867]: > Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.839096 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f365263a75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:40.255761013 +0000 UTC m=+14.482298570,LastTimestamp:2026-03-20 00:06:40.255761013 +0000 UTC m=+14.482298570,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.843892 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 00:06:50 crc 
kubenswrapper[4867]: &Event{ObjectMeta:{kube-apiserver-crc.189e63f36541dc6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:39636->192.168.126.11:17697: read: connection reset by peer Mar 20 00:06:50 crc kubenswrapper[4867]: body: Mar 20 00:06:50 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:40.257571949 +0000 UTC m=+14.484109486,LastTimestamp:2026-03-20 00:06:40.257571949 +0000 UTC m=+14.484109486,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:06:50 crc kubenswrapper[4867]: > Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.848461 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f36542abbd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39636->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:40.257625021 +0000 UTC m=+14.484162548,LastTimestamp:2026-03-20 
00:06:40.257625021 +0000 UTC m=+14.484162548,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.853036 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e63f36523e90b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 00:06:50 crc kubenswrapper[4867]: &Event{ObjectMeta:{kube-apiserver-crc.189e63f36523e90b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 00:06:50 crc kubenswrapper[4867]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 00:06:50 crc kubenswrapper[4867]: Mar 20 00:06:50 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:40.255609099 +0000 UTC m=+14.482146656,LastTimestamp:2026-03-20 00:06:40.260518095 +0000 UTC m=+14.487055622,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:06:50 crc kubenswrapper[4867]: > Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.857610 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e63f365263a75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f365263a75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:40.255761013 +0000 UTC m=+14.482298570,LastTimestamp:2026-03-20 00:06:40.260573456 +0000 UTC m=+14.487110993,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.862886 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e63f0dd67e26b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e63f0dd67e26b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:29.388427883 +0000 UTC m=+3.614965400,LastTimestamp:2026-03-20 00:06:40.5187284 +0000 UTC m=+14.745265917,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.869600 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 00:06:50 crc kubenswrapper[4867]: &Event{ObjectMeta:{kube-controller-manager-crc.189e63f471b451c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 00:06:50 crc kubenswrapper[4867]: body: Mar 20 00:06:50 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:44.761366984 +0000 UTC m=+18.987904531,LastTimestamp:2026-03-20 00:06:44.761366984 +0000 UTC m=+18.987904531,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:06:50 crc kubenswrapper[4867]: > Mar 20 00:06:50 crc kubenswrapper[4867]: E0320 00:06:50.876115 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f471b538a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:44.761426086 +0000 UTC m=+18.987963643,LastTimestamp:2026-03-20 00:06:44.761426086 +0000 UTC m=+18.987963643,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:51 crc kubenswrapper[4867]: I0320 00:06:51.340770 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:52 crc kubenswrapper[4867]: I0320 00:06:52.339843 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:52 crc kubenswrapper[4867]: W0320 00:06:52.366362 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 00:06:52 crc kubenswrapper[4867]: E0320 00:06:52.366484 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 20 00:06:52 crc kubenswrapper[4867]: W0320 00:06:52.762693 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 00:06:52 crc kubenswrapper[4867]: E0320 00:06:52.763114 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 00:06:53 crc kubenswrapper[4867]: I0320 00:06:53.338538 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:53 crc kubenswrapper[4867]: I0320 00:06:53.662471 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:53 crc kubenswrapper[4867]: I0320 00:06:53.664661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:53 crc kubenswrapper[4867]: I0320 00:06:53.664748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:53 crc kubenswrapper[4867]: I0320 00:06:53.664790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:53 crc kubenswrapper[4867]: I0320 00:06:53.664836 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:06:53 crc kubenswrapper[4867]: E0320 00:06:53.677388 4867 kubelet_node_status.go:99] "Unable to register node with API server" 
err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 00:06:53 crc kubenswrapper[4867]: E0320 00:06:53.678089 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.343605 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.750655 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.750735 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.750808 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.751044 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.752870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.752972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.753036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.754126 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 00:06:54 crc kubenswrapper[4867]: I0320 00:06:54.754573 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54" gracePeriod=30 Mar 20 00:06:54 crc kubenswrapper[4867]: E0320 00:06:54.759302 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e63f471b451c8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 00:06:54 crc kubenswrapper[4867]: &Event{ObjectMeta:{kube-controller-manager-crc.189e63f471b451c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 00:06:54 crc kubenswrapper[4867]: body: Mar 20 00:06:54 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:44.761366984 +0000 UTC m=+18.987904531,LastTimestamp:2026-03-20 00:06:54.750714394 +0000 UTC m=+28.977251951,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:06:54 crc kubenswrapper[4867]: > Mar 20 00:06:54 crc kubenswrapper[4867]: E0320 00:06:54.771975 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e63f471b538a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f471b538a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:44.761426086 +0000 UTC m=+18.987963643,LastTimestamp:2026-03-20 00:06:54.750772496 
+0000 UTC m=+28.977310053,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:54 crc kubenswrapper[4867]: E0320 00:06:54.781018 4867 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f6c5582117 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:54.754545943 +0000 UTC m=+28.981083500,LastTimestamp:2026-03-20 00:06:54.754545943 +0000 UTC m=+28.981083500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:54 crc kubenswrapper[4867]: E0320 00:06:54.892930 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e63f06e35eaa5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f06e35eaa5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.522882213 +0000 UTC m=+1.749419730,LastTimestamp:2026-03-20 00:06:54.884137203 +0000 UTC m=+29.110674760,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:55 crc kubenswrapper[4867]: E0320 00:06:55.141242 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e63f081951493\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f081951493 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.847885971 +0000 UTC m=+2.074423528,LastTimestamp:2026-03-20 00:06:55.135458281 +0000 UTC m=+29.361995808,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:55 crc kubenswrapper[4867]: E0320 00:06:55.155211 4867 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e63f08215b826\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f08215b826 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:27.856316454 +0000 UTC m=+2.082853971,LastTimestamp:2026-03-20 00:06:55.149175094 +0000 UTC m=+29.375712611,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.339060 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.581544 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.582238 4867 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54" exitCode=255 Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.582305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54"} Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.582349 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628"} Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.582521 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.584013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.584133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:06:55 crc kubenswrapper[4867]: I0320 00:06:55.584206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:06:56 crc kubenswrapper[4867]: W0320 00:06:56.105665 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:56 crc kubenswrapper[4867]: E0320 00:06:56.105739 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 00:06:56 crc kubenswrapper[4867]: W0320 00:06:56.277673 4867 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 00:06:56 crc kubenswrapper[4867]: E0320 00:06:56.277739 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 00:06:56 crc kubenswrapper[4867]: I0320 00:06:56.339699 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:56 crc kubenswrapper[4867]: E0320 00:06:56.454276 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:06:57 crc kubenswrapper[4867]: I0320 00:06:57.334660 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:58 crc kubenswrapper[4867]: I0320 00:06:58.339678 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:06:59 crc kubenswrapper[4867]: I0320 00:06:59.339783 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:00 crc 
kubenswrapper[4867]: I0320 00:07:00.341352 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:00 crc kubenswrapper[4867]: I0320 00:07:00.679357 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:00 crc kubenswrapper[4867]: I0320 00:07:00.682057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:00 crc kubenswrapper[4867]: I0320 00:07:00.682134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:00 crc kubenswrapper[4867]: I0320 00:07:00.682154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:00 crc kubenswrapper[4867]: I0320 00:07:00.682203 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:07:00 crc kubenswrapper[4867]: E0320 00:07:00.684741 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 00:07:00 crc kubenswrapper[4867]: E0320 00:07:00.686048 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 00:07:01 crc kubenswrapper[4867]: I0320 00:07:01.341019 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 20 00:07:01 crc kubenswrapper[4867]: I0320 00:07:01.749582 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:07:01 crc kubenswrapper[4867]: I0320 00:07:01.749874 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:01 crc kubenswrapper[4867]: I0320 00:07:01.751407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:01 crc kubenswrapper[4867]: I0320 00:07:01.751470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:01 crc kubenswrapper[4867]: I0320 00:07:01.751486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:02 crc kubenswrapper[4867]: I0320 00:07:02.337600 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:02 crc kubenswrapper[4867]: I0320 00:07:02.420994 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:02 crc kubenswrapper[4867]: I0320 00:07:02.421863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:02 crc kubenswrapper[4867]: I0320 00:07:02.421922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:02 crc kubenswrapper[4867]: I0320 00:07:02.421945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:02 crc kubenswrapper[4867]: I0320 00:07:02.422736 4867 scope.go:117] "RemoveContainer" 
containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.311479 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.311696 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.313088 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.313132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.313148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.340039 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.608621 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.609405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.611865 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" 
exitCode=255 Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.611920 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c"} Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.611976 4867 scope.go:117] "RemoveContainer" containerID="2fc1c105dfaca16763aabc3ac4973328d19a9dea658f7569d8b10ed5377cc64b" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.612161 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.613237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.613269 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.613285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:03 crc kubenswrapper[4867]: I0320 00:07:03.613985 4867 scope.go:117] "RemoveContainer" containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" Mar 20 00:07:03 crc kubenswrapper[4867]: E0320 00:07:03.614224 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:04 crc kubenswrapper[4867]: I0320 00:07:04.339299 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:04 crc kubenswrapper[4867]: I0320 00:07:04.617630 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 00:07:04 crc kubenswrapper[4867]: I0320 00:07:04.749870 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:07:04 crc kubenswrapper[4867]: I0320 00:07:04.749945 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:07:04 crc kubenswrapper[4867]: E0320 00:07:04.759853 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e63f471b451c8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 00:07:04 crc kubenswrapper[4867]: &Event{ObjectMeta:{kube-controller-manager-crc.189e63f471b451c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 00:07:04 crc kubenswrapper[4867]: body: Mar 20 00:07:04 crc kubenswrapper[4867]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:44.761366984 +0000 UTC m=+18.987904531,LastTimestamp:2026-03-20 00:07:04.749925777 +0000 UTC m=+38.976463324,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 00:07:04 crc kubenswrapper[4867]: > Mar 20 00:07:04 crc kubenswrapper[4867]: E0320 00:07:04.768332 4867 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e63f471b538a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e63f471b538a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:06:44.761426086 +0000 UTC m=+18.987963643,LastTimestamp:2026-03-20 00:07:04.749977818 
+0000 UTC m=+38.976515405,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:07:05 crc kubenswrapper[4867]: I0320 00:07:05.340716 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:06 crc kubenswrapper[4867]: I0320 00:07:06.339912 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:06 crc kubenswrapper[4867]: E0320 00:07:06.454787 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:07:07 crc kubenswrapper[4867]: I0320 00:07:07.341202 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:07 crc kubenswrapper[4867]: W0320 00:07:07.372331 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 00:07:07 crc kubenswrapper[4867]: E0320 00:07:07.372426 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 
00:07:07 crc kubenswrapper[4867]: I0320 00:07:07.685680 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:07 crc kubenswrapper[4867]: I0320 00:07:07.687322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:07 crc kubenswrapper[4867]: I0320 00:07:07.687387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:07 crc kubenswrapper[4867]: I0320 00:07:07.687412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:07 crc kubenswrapper[4867]: I0320 00:07:07.687452 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:07:07 crc kubenswrapper[4867]: E0320 00:07:07.696472 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 00:07:07 crc kubenswrapper[4867]: E0320 00:07:07.696966 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.338011 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.790226 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.790407 4867 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.791723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.791794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.791814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:08 crc kubenswrapper[4867]: I0320 00:07:08.792721 4867 scope.go:117] "RemoveContainer" containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" Mar 20 00:07:08 crc kubenswrapper[4867]: E0320 00:07:08.793022 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.340255 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.452852 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.634802 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.635878 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.635944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.635963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:09 crc kubenswrapper[4867]: I0320 00:07:09.636882 4867 scope.go:117] "RemoveContainer" containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" Mar 20 00:07:09 crc kubenswrapper[4867]: E0320 00:07:09.637252 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:10 crc kubenswrapper[4867]: I0320 00:07:10.339639 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:11 crc kubenswrapper[4867]: I0320 00:07:11.338449 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.339614 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.727926 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.728157 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.730251 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.730315 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.730339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:12 crc kubenswrapper[4867]: I0320 00:07:12.735333 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:07:13 crc kubenswrapper[4867]: I0320 00:07:13.339206 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:13 crc kubenswrapper[4867]: I0320 00:07:13.646580 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:13 crc kubenswrapper[4867]: I0320 00:07:13.647835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:13 crc kubenswrapper[4867]: I0320 00:07:13.647904 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:13 crc kubenswrapper[4867]: I0320 00:07:13.647924 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:14 crc kubenswrapper[4867]: I0320 00:07:14.335092 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:14 crc kubenswrapper[4867]: I0320 00:07:14.697191 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:14 crc kubenswrapper[4867]: I0320 00:07:14.698674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:14 crc kubenswrapper[4867]: I0320 00:07:14.698707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:14 crc kubenswrapper[4867]: I0320 00:07:14.698718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:14 crc kubenswrapper[4867]: I0320 00:07:14.698741 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:07:14 crc kubenswrapper[4867]: E0320 00:07:14.705708 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 00:07:14 crc kubenswrapper[4867]: E0320 00:07:14.706022 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 00:07:15 crc kubenswrapper[4867]: I0320 00:07:15.339919 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:15 crc kubenswrapper[4867]: W0320 00:07:15.917021 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 00:07:15 crc kubenswrapper[4867]: E0320 00:07:15.917094 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 00:07:16 crc kubenswrapper[4867]: I0320 00:07:16.340144 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:16 crc kubenswrapper[4867]: E0320 00:07:16.455347 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:07:17 crc kubenswrapper[4867]: I0320 00:07:17.339754 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:17 crc kubenswrapper[4867]: W0320 00:07:17.774200 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:17 crc kubenswrapper[4867]: E0320 00:07:17.774245 4867 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 00:07:18 crc kubenswrapper[4867]: I0320 00:07:18.338800 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:19 crc kubenswrapper[4867]: I0320 00:07:19.339582 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:19 crc kubenswrapper[4867]: W0320 00:07:19.547143 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 00:07:19 crc kubenswrapper[4867]: E0320 00:07:19.547270 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 00:07:20 crc kubenswrapper[4867]: I0320 00:07:20.339658 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.339806 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.421113 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.422809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.422861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.422880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.423667 4867 scope.go:117] "RemoveContainer" containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" Mar 20 00:07:21 crc kubenswrapper[4867]: E0320 00:07:21.423936 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.705810 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.707577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.707629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:21 crc 
kubenswrapper[4867]: I0320 00:07:21.707648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:21 crc kubenswrapper[4867]: I0320 00:07:21.707678 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:07:21 crc kubenswrapper[4867]: E0320 00:07:21.712530 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 00:07:21 crc kubenswrapper[4867]: E0320 00:07:21.712973 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 00:07:22 crc kubenswrapper[4867]: I0320 00:07:22.339158 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:22 crc kubenswrapper[4867]: I0320 00:07:22.733183 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 00:07:22 crc kubenswrapper[4867]: I0320 00:07:22.733365 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:22 crc kubenswrapper[4867]: I0320 00:07:22.734732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:22 crc kubenswrapper[4867]: I0320 00:07:22.734799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:22 crc kubenswrapper[4867]: I0320 00:07:22.734819 4867 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:23 crc kubenswrapper[4867]: I0320 00:07:23.342204 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:24 crc kubenswrapper[4867]: I0320 00:07:24.340304 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:25 crc kubenswrapper[4867]: I0320 00:07:25.339774 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:26 crc kubenswrapper[4867]: I0320 00:07:26.340051 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:26 crc kubenswrapper[4867]: E0320 00:07:26.456280 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:07:27 crc kubenswrapper[4867]: I0320 00:07:27.338616 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:28 crc kubenswrapper[4867]: I0320 00:07:28.339379 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:28 crc kubenswrapper[4867]: I0320 00:07:28.712955 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:28 crc kubenswrapper[4867]: I0320 00:07:28.715045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:28 crc kubenswrapper[4867]: I0320 00:07:28.715104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:28 crc kubenswrapper[4867]: I0320 00:07:28.715121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:28 crc kubenswrapper[4867]: I0320 00:07:28.715634 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:07:28 crc kubenswrapper[4867]: E0320 00:07:28.722625 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 00:07:28 crc kubenswrapper[4867]: E0320 00:07:28.722794 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 00:07:29 crc kubenswrapper[4867]: I0320 00:07:29.341184 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:30 crc kubenswrapper[4867]: I0320 00:07:30.334741 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 00:07:30 crc kubenswrapper[4867]: I0320 00:07:30.898657 4867 csr.go:261] certificate signing request csr-cndvw is approved, waiting to be issued Mar 20 00:07:30 crc kubenswrapper[4867]: I0320 00:07:30.907881 4867 csr.go:257] certificate signing request csr-cndvw is issued Mar 20 00:07:30 crc kubenswrapper[4867]: I0320 00:07:30.964807 4867 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 00:07:31 crc kubenswrapper[4867]: I0320 00:07:31.193045 4867 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 00:07:31 crc kubenswrapper[4867]: I0320 00:07:31.908976 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-06 23:13:21.829946109 +0000 UTC Mar 20 00:07:31 crc kubenswrapper[4867]: I0320 00:07:31.909038 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7031h5m49.920913516s for next certificate rotation Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.421488 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.422928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.422995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.423012 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.424018 4867 scope.go:117] "RemoveContainer" 
containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.707431 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.709606 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894"} Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.709739 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.710575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.710641 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:33 crc kubenswrapper[4867]: I0320 00:07:33.710658 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.712930 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.713559 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.715567 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" exitCode=255 Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.715609 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894"} Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.715649 4867 scope.go:117] "RemoveContainer" containerID="1e26a8343f24ad423b557e412deec257f373c763e85d3ca466d43704e73de37c" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.715754 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.716539 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.716572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.716588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:34 crc kubenswrapper[4867]: I0320 00:07:34.717322 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:07:34 crc kubenswrapper[4867]: E0320 00:07:34.717540 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.720433 4867 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.722845 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.724082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.724134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.724149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.724263 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.733619 4867 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.733982 4867 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.734030 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.736846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.736880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.736891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:35 crc kubenswrapper[4867]: 
I0320 00:07:35.736909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.736921 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:35Z","lastTransitionTime":"2026-03-20T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.747548 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.753965 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.753994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.754003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.754013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.754022 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:35Z","lastTransitionTime":"2026-03-20T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.762555 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.770618 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.770674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.770692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.770716 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.770733 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:35Z","lastTransitionTime":"2026-03-20T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.784706 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.796664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.796747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.796777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.796810 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:35 crc kubenswrapper[4867]: I0320 00:07:35.796835 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:35Z","lastTransitionTime":"2026-03-20T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.808521 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.808687 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.808713 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:35 crc kubenswrapper[4867]: E0320 00:07:35.909834 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.010146 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.110234 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.211087 4867 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.311356 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.412527 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.457467 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.513110 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.614188 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.715276 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.816289 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:36 crc kubenswrapper[4867]: E0320 00:07:36.916768 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.017057 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.117987 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.219063 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 
00:07:37.320070 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.420202 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: I0320 00:07:37.421414 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:37 crc kubenswrapper[4867]: I0320 00:07:37.422763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:37 crc kubenswrapper[4867]: I0320 00:07:37.422817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:37 crc kubenswrapper[4867]: I0320 00:07:37.422830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.520768 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.621271 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.721453 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.821722 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:37 crc kubenswrapper[4867]: E0320 00:07:37.922857 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.023903 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.124565 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.225194 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.326198 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.426722 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.527579 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.628147 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.728455 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: I0320 00:07:38.790852 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:07:38 crc kubenswrapper[4867]: I0320 00:07:38.791050 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:38 crc kubenswrapper[4867]: I0320 00:07:38.792399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:38 crc kubenswrapper[4867]: I0320 00:07:38.792436 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:38 crc kubenswrapper[4867]: I0320 00:07:38.792452 4867 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:38 crc kubenswrapper[4867]: I0320 00:07:38.793319 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.793599 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.829549 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:38 crc kubenswrapper[4867]: E0320 00:07:38.930574 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.031609 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.131953 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.232671 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.333116 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.433770 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: I0320 00:07:39.452151 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.534543 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.635564 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: I0320 00:07:39.732731 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 00:07:39 crc kubenswrapper[4867]: I0320 00:07:39.733984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:39 crc kubenswrapper[4867]: I0320 00:07:39.734048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:39 crc kubenswrapper[4867]: I0320 00:07:39.734065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:39 crc kubenswrapper[4867]: I0320 00:07:39.735122 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.735454 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.735665 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.836002 4867 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:39 crc kubenswrapper[4867]: E0320 00:07:39.937023 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.037929 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.138146 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.238942 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.339440 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.440038 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.540357 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.640992 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.741767 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.842255 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:40 crc kubenswrapper[4867]: E0320 00:07:40.942753 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc 
kubenswrapper[4867]: E0320 00:07:41.043838 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.145708 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.246626 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.347354 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.447768 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.548102 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.649200 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.749797 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.850122 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:41 crc kubenswrapper[4867]: E0320 00:07:41.951068 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.051939 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.152985 4867 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.254334 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.354552 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.455392 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.556523 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.656634 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.756758 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.857075 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:42 crc kubenswrapper[4867]: E0320 00:07:42.957200 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.057382 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.158394 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.259349 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.359624 4867 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.460010 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.560602 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.661363 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.762522 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.862976 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:43 crc kubenswrapper[4867]: E0320 00:07:43.964127 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.065227 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.165413 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.266186 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.366592 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.466814 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 
00:07:44.567622 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.668534 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.769428 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.869857 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:44 crc kubenswrapper[4867]: E0320 00:07:44.970401 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.070964 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.171354 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.271924 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.372268 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.473480 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.574064 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.674965 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.775095 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.876066 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.947098 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.952151 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.952210 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.952276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.952831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.952898 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:45Z","lastTransitionTime":"2026-03-20T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.969901 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.981866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.981952 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.981978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.982012 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:45 crc kubenswrapper[4867]: I0320 00:07:45.982038 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:45Z","lastTransitionTime":"2026-03-20T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:45 crc kubenswrapper[4867]: E0320 00:07:45.998897 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.007839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.007929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.007958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.007986 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.008005 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:46Z","lastTransitionTime":"2026-03-20T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.022104 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.031887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.031954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.031978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.032007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:46 crc kubenswrapper[4867]: I0320 00:07:46.032030 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:46Z","lastTransitionTime":"2026-03-20T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.044628 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.044906 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.044947 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.145353 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.245767 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.347027 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.447923 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.458823 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.548434 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.648970 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.749069 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: 
E0320 00:07:46.850365 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:46 crc kubenswrapper[4867]: E0320 00:07:46.950981 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.052251 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.152407 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.253591 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.354675 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.455441 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.556013 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.656737 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.757345 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.857815 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 00:07:47 crc kubenswrapper[4867]: E0320 00:07:47.959124 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.030884 4867 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.062471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.062564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.062592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.062622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.062645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.165935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.165993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.166010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.166037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.166057 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.268822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.268886 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.268909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.268934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.268954 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.369306 4867 apiserver.go:52] "Watching apiserver" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.377646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.377704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.377725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.377760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.377780 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.383038 4867 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.383885 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-zgbkt","openshift-multus/network-metrics-daemon-rkq8h","openshift-multus/multus-additional-cni-plugins-dfc6c","openshift-machine-config-operator/machine-config-daemon-v9vbm","openshift-multus/multus-98n2n","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97","openshift-image-registry/node-ca-2xwxb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-5zkft","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.384411 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.384678 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.384683 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.384994 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.385040 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.385141 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.385545 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.385875 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.385955 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.386587 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.387091 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.387239 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.387241 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.387265 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.387444 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.387474 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.387945 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.388250 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.388540 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.388728 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.388819 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.392881 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.393241 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.393535 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.393782 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.394013 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.394312 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.394945 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.395168 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.395528 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.396127 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 00:07:48 crc 
kubenswrapper[4867]: I0320 00:07:48.396326 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.396453 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.396970 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.397158 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.397344 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.397493 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.397682 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.397997 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.398128 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.398366 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.398653 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.398794 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.399145 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.399340 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.399523 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.399701 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.399888 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.400588 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.400729 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.400921 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.401959 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.402372 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.402723 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.403378 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.420793 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.436834 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.446405 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.452632 4867 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.467352 4867 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476560 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476615 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476777 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476840 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476877 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 00:07:48 crc kubenswrapper[4867]: 
I0320 00:07:48.476909 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476978 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.476983 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477020 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477078 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477150 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477186 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477245 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477280 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477347 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477361 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477406 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477439 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477512 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477573 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477569 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477606 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477774 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477815 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477819 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.477837 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478026 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478080 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478106 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478137 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478190 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478234 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478245 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478277 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478293 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478321 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478328 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478369 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478420 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478550 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478597 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478645 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478678 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478696 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478708 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478757 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478741 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478840 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478843 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478832 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.478872 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479077 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479118 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479183 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479242 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479281 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479295 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479347 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479397 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479443 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479530 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479582 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479630 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479755 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.479886 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480056 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480425 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480486 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480479 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480701 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480765 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481021 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481187 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481347 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481449 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481707 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481760 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481809 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481859 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481906 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481953 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482002 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482052 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482104 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482158 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482206 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482252 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482298 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482400 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482457 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482556 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482614 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482673 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482719 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.485515 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.485566 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.485894 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.483486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486436 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486466 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.480783 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481052 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481451 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481858 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481934 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.481967 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482044 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482339 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482386 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.482435 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.483250 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.483340 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.483389 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.483608 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.485259 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.485341 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.485385 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.485690 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:07:48.985653863 +0000 UTC m=+83.212191480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487397 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487425 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487453 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487516 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487540 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487562 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487585 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487611 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487638 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487690 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487726 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487758 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487800 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487824 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487849 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487872 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487898 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487957 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487979 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488003 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488026 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488033 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488048 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488299 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488344 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488384 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488422 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488461 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488541 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488586 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488652 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488688 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488722 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 00:07:48 crc 
kubenswrapper[4867]: I0320 00:07:48.488755 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488790 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488832 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488909 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488944 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488979 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489031 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488071 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488284 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486126 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489122 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486183 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486205 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486201 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486287 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486319 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486357 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.487221 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488464 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489188 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489188 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488651 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488741 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488752 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488754 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488780 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488888 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.486076 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488893 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.488942 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489019 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489375 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489436 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489650 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489684 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.489808 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.490303 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.490448 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.490392 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.490769 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.490839 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.490887 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.491412 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.491782 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.491800 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.491920 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.491879 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.492007 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.492108 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493227 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493275 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493299 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493449 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493546 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493644 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493679 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493706 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493748 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493791 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493828 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493892 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493985 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494017 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494050 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494084 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494115 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494151 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494183 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494216 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494250 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494283 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494316 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494385 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.493841 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494469 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494479 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494416 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494699 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494759 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494799 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494140 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494832 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494865 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494896 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494963 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494997 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495029 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495063 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495100 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495134 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495173 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495204 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495235 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495269 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495305 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495339 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495373 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495414 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495559 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495600 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495639 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495686 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495736 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495772 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495811 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495850 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495891 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495932 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495970 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496003 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496037 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496071 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496104 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496137 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496172 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496220 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496260 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496296 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496329 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496364 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496411 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496463 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496540 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494127 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496595 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494243 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496603 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494876 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.494888 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495257 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495314 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495324 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495632 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495828 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496901 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496655 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496950 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497002 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497053 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497106 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497373 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495841 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497281 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497626 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497863 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.495988 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496097 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496251 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496372 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496545 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.496627 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497540 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497648 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.497926 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.498260 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.498858 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-system-cni-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.498888 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.498902 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.499131 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.499189 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-etc-kubernetes\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.499218 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.499635 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.499468 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bnpt\" (UniqueName: \"kubernetes.io/projected/00eacbd3-d921-414b-8b8d-c4298bdd5a28-kube-api-access-5bnpt\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.500018 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.500136 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.500338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.500600 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.500903 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501293 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501385 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-ovn\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501420 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-env-overrides\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501454 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501585 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501754 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501764 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501794 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501800 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501879 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501897 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501904 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.501485 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502043 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-k8s-cni-cncf-io\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502244 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502283 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/00eacbd3-d921-414b-8b8d-c4298bdd5a28-rootfs\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502364 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbmp\" (UniqueName: \"kubernetes.io/projected/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-kube-api-access-hzbmp\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502585 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502645 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xl5d\" (UniqueName: \"kubernetes.io/projected/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-kube-api-access-7xl5d\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502704 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-netns\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502844 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502930 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.502974 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-os-release\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503026 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-hostroot\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503057 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.503200 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503322 4867 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.503439 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:49.003407931 +0000 UTC m=+83.229945528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503794 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503898 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.503955 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn55k\" (UniqueName: \"kubernetes.io/projected/0e040dc6-20c6-4d82-b719-bf25fa43db67-kube-api-access-xn55k\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504008 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-socket-dir-parent\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504050 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/97e52c03-2ca5-4cad-8459-f03029234544-multus-daemon-config\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504087 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-slash\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6d1df48-b16a-4691-82bf-68d8cce94a42-serviceca\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504162 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-conf-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504195 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-netd\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504230 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-config\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504260 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504305 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504371 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-cni-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504408 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97e52c03-2ca5-4cad-8459-f03029234544-cni-binary-copy\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504444 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-cni-bin\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504478 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-kubelet\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504548 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-script-lib\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504608 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-var-lib-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504613 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504644 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6d1df48-b16a-4691-82bf-68d8cce94a42-host\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504733 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-system-cni-dir\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504750 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504754 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.504836 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504855 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02c4bd4e-640d-48b8-8e73-3aead59105b9-hosts-file\") pod \"node-resolver-zgbkt\" (UID: \"02c4bd4e-640d-48b8-8e73-3aead59105b9\") " pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.504934 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:49.004911079 +0000 UTC m=+83.231448676 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.504975 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00eacbd3-d921-414b-8b8d-c4298bdd5a28-proxy-tls\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505048 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-kubelet\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505081 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-systemd-units\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505160 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc 
kubenswrapper[4867]: I0320 00:07:48.505230 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00eacbd3-d921-414b-8b8d-c4298bdd5a28-mcd-auth-proxy-config\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505782 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-log-socket\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505827 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505870 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.505870 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506114 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kk54\" (UniqueName: \"kubernetes.io/projected/c6d1df48-b16a-4691-82bf-68d8cce94a42-kube-api-access-6kk54\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506200 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506270 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506634 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506874 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506898 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.506954 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507017 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-etc-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507060 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507063 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovn-node-metrics-cert\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507131 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507090 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507289 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cni-binary-copy\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507315 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-node-log\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507339 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-bin\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507366 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507444 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507898 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-cni-multus\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507940 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-multus-certs\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.507971 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508006 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-cnibin\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508060 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-netns\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508107 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cnibin\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508134 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-os-release\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508166 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508195 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff4ht\" (UniqueName: \"kubernetes.io/projected/02c4bd4e-640d-48b8-8e73-3aead59105b9-kube-api-access-ff4ht\") pod \"node-resolver-zgbkt\" (UID: \"02c4bd4e-640d-48b8-8e73-3aead59105b9\") " pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvzk5\" (UniqueName: \"kubernetes.io/projected/97e52c03-2ca5-4cad-8459-f03029234544-kube-api-access-fvzk5\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508251 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-systemd\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508275 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-68jdx\" (UniqueName: \"kubernetes.io/projected/f1af2033-700e-4f63-939d-b7132a1e5b5f-kube-api-access-68jdx\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508759 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508873 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.508997 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.509259 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.509263 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.509266 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.509858 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.509889 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.509903 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510029 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510052 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510080 4867 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510097 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510112 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" 
DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510125 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510140 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510198 4867 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510239 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510273 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510301 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510327 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510353 4867 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510382 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510410 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510420 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510690 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510581 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510784 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510794 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510436 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510926 4867 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.510976 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511323 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511383 4867 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511398 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511410 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511438 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511454 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511467 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511483 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: 
I0320 00:07:48.511519 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511532 4867 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511544 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511556 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511604 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511620 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511633 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511701 4867 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511717 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511729 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511743 4867 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511775 4867 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511790 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511806 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511821 4867 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511833 4867 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511883 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511930 4867 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511947 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511960 4867 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511972 4867 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.511985 4867 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512016 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512028 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512045 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512113 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512936 4867 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512952 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512966 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node 
\"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512978 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513013 4867 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513027 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513040 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513055 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513087 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513099 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513112 4867 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513125 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513138 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513171 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512367 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513238 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513187 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513406 4867 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512539 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.512553 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513069 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513513 4867 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513544 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513567 4867 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513588 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513607 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513625 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513644 4867 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: 
I0320 00:07:48.513663 4867 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513683 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513703 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513720 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513739 4867 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513766 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513773 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513785 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513837 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513863 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513884 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513904 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513923 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513942 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 
crc kubenswrapper[4867]: I0320 00:07:48.513961 4867 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513980 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513998 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514017 4867 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514034 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514052 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514072 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514090 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514108 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514125 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514143 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514161 4867 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514178 4867 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514196 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514214 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc 
kubenswrapper[4867]: I0320 00:07:48.514231 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514249 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514268 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514290 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514308 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514327 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514343 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514360 4867 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514378 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514395 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514412 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514429 4867 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514447 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514464 4867 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514482 4867 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514540 4867 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514558 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514578 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514595 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514615 4867 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514635 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514653 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath 
\"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514671 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514698 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514716 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514733 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514750 4867 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514767 4867 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514785 4867 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514803 4867 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514820 4867 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514838 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514854 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514871 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514889 4867 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514908 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514926 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514945 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514961 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514979 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.514998 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.513070 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.517722 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.517609 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.517929 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.518162 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.522534 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.522718 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.523745 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.523768 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.523817 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.523878 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:49.023858917 +0000 UTC m=+83.250396444 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.524579 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.533637 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.533815 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.533872 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.533883 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.534028 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.534616 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.534677 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.535981 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.536373 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.536454 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.536575 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.536579 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.536705 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.536765 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.536985 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.539010 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.539027 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.539076 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:49.039057252 +0000 UTC m=+83.265594779 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.538243 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.540303 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.542213 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.554027 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.557849 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.558549 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.565205 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.568199 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.569454 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.574199 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.582562 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.588930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.588987 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.589000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.589017 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.589031 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.590315 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.598427 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.607966 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615333 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-k8s-cni-cncf-io\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615363 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/00eacbd3-d921-414b-8b8d-c4298bdd5a28-rootfs\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615381 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-ovn\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615413 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-env-overrides\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615428 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615444 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbmp\" (UniqueName: \"kubernetes.io/projected/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-kube-api-access-hzbmp\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615461 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xl5d\" (UniqueName: \"kubernetes.io/projected/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-kube-api-access-7xl5d\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615475 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-netns\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615476 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-k8s-cni-cncf-io\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615573 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-os-release\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615606 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-hostroot\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615623 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615611 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615679 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn55k\" (UniqueName: \"kubernetes.io/projected/0e040dc6-20c6-4d82-b719-bf25fa43db67-kube-api-access-xn55k\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615729 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-socket-dir-parent\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97e52c03-2ca5-4cad-8459-f03029234544-multus-daemon-config\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.615920 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-ovn\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616088 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-netns\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616153 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-os-release\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616188 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-hostroot\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616385 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-env-overrides\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616504 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-slash\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 
00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616641 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-socket-dir-parent\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.616939 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.617208 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/00eacbd3-d921-414b-8b8d-c4298bdd5a28-rootfs\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.617396 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-slash\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.617622 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6d1df48-b16a-4691-82bf-68d8cce94a42-serviceca\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.618003 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-cni-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.617987 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.618329 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97e52c03-2ca5-4cad-8459-f03029234544-cni-binary-copy\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619043 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-cni-bin\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619292 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-kubelet\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619384 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97e52c03-2ca5-4cad-8459-f03029234544-multus-daemon-config\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619467 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619528 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-conf-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-netd\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619580 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-cni-bin\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619637 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-conf-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619648 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-netd\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619573 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-config\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619725 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-script-lib\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619874 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-kubelet\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619729 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-multus-cni-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.619991 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02c4bd4e-640d-48b8-8e73-3aead59105b9-hosts-file\") pod \"node-resolver-zgbkt\" (UID: \"02c4bd4e-640d-48b8-8e73-3aead59105b9\") " pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620136 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00eacbd3-d921-414b-8b8d-c4298bdd5a28-proxy-tls\") pod \"machine-config-daemon-v9vbm\" (UID: 
\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620160 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-kubelet\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620234 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02c4bd4e-640d-48b8-8e73-3aead59105b9-hosts-file\") pod \"node-resolver-zgbkt\" (UID: \"02c4bd4e-640d-48b8-8e73-3aead59105b9\") " pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620009 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97e52c03-2ca5-4cad-8459-f03029234544-cni-binary-copy\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620307 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-systemd-units\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620333 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-var-lib-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 
00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620466 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6d1df48-b16a-4691-82bf-68d8cce94a42-host\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620530 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-system-cni-dir\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-config\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620558 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620591 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-var-lib-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620602 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6d1df48-b16a-4691-82bf-68d8cce94a42-serviceca\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620618 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-kubelet\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-systemd-units\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620664 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-system-cni-dir\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620676 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6d1df48-b16a-4691-82bf-68d8cce94a42-host\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620678 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620698 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00eacbd3-d921-414b-8b8d-c4298bdd5a28-mcd-auth-proxy-config\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-log-socket\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620751 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620786 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-log-socket\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.620931 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620956 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-etc-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.620975 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:49.120960606 +0000 UTC m=+83.347498133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.620996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovn-node-metrics-cert\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621008 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-etc-openvswitch\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621034 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6kk54\" (UniqueName: \"kubernetes.io/projected/c6d1df48-b16a-4691-82bf-68d8cce94a42-kube-api-access-6kk54\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621058 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621086 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cni-binary-copy\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621121 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-cni-multus\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621142 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-node-log\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621164 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-bin\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621188 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-multus-certs\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621210 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621232 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621251 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-cnibin\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621272 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-netns\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621292 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cnibin\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621315 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff4ht\" (UniqueName: \"kubernetes.io/projected/02c4bd4e-640d-48b8-8e73-3aead59105b9-kube-api-access-ff4ht\") pod \"node-resolver-zgbkt\" (UID: \"02c4bd4e-640d-48b8-8e73-3aead59105b9\") " pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621341 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00eacbd3-d921-414b-8b8d-c4298bdd5a28-mcd-auth-proxy-config\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621341 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvzk5\" (UniqueName: \"kubernetes.io/projected/97e52c03-2ca5-4cad-8459-f03029234544-kube-api-access-fvzk5\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621405 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-systemd\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621466 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68jdx\" (UniqueName: \"kubernetes.io/projected/f1af2033-700e-4f63-939d-b7132a1e5b5f-kube-api-access-68jdx\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621507 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-os-release\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-system-cni-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621550 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-etc-kubernetes\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621758 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621823 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-var-lib-cni-multus\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621839 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-bin\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621848 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-host-run-multus-certs\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621873 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-node-log\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.621897 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-netns\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622012 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cnibin\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622053 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-cnibin\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622073 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-etc-kubernetes\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622074 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bnpt\" (UniqueName: \"kubernetes.io/projected/00eacbd3-d921-414b-8b8d-c4298bdd5a28-kube-api-access-5bnpt\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622092 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-systemd\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622116 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622213 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622232 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-os-release\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622235 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622246 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-cni-binary-copy\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: 
\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622260 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622270 4867 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622251 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622270 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622279 4867 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622326 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622341 4867 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622355 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622276 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97e52c03-2ca5-4cad-8459-f03029234544-system-cni-dir\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622372 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622399 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622413 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622425 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622438 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622451 4867 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622464 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622477 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622490 4867 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622528 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622543 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc 
kubenswrapper[4867]: I0320 00:07:48.622555 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622567 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622581 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622595 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622607 4867 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622620 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622633 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 
00:07:48.622645 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622657 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622668 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622682 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622694 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622706 4867 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622718 4867 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622731 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622745 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622757 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622769 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622781 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622794 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622805 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622818 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 
00:07:48.622830 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622842 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622854 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622867 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622879 4867 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622891 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622903 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622915 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622930 4867 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622942 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622955 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.622968 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.625915 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-script-lib\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.629688 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00eacbd3-d921-414b-8b8d-c4298bdd5a28-proxy-tls\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.630007 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.632909 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xl5d\" (UniqueName: \"kubernetes.io/projected/a4d1c044-4ed7-44b6-9cd0-e52371e17e40-kube-api-access-7xl5d\") pod \"ovnkube-control-plane-749d76644c-tfl97\" (UID: \"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.634449 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovn-node-metrics-cert\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.634982 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.637169 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbmp\" (UniqueName: \"kubernetes.io/projected/c16597ca-4e52-469a-b6c2-cb2c0d0f07cf-kube-api-access-hzbmp\") pod \"multus-additional-cni-plugins-dfc6c\" (UID: \"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\") " pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.639865 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68jdx\" (UniqueName: 
\"kubernetes.io/projected/f1af2033-700e-4f63-939d-b7132a1e5b5f-kube-api-access-68jdx\") pod \"ovnkube-node-5zkft\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.643342 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff4ht\" (UniqueName: \"kubernetes.io/projected/02c4bd4e-640d-48b8-8e73-3aead59105b9-kube-api-access-ff4ht\") pod \"node-resolver-zgbkt\" (UID: \"02c4bd4e-640d-48b8-8e73-3aead59105b9\") " pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.644127 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn55k\" (UniqueName: \"kubernetes.io/projected/0e040dc6-20c6-4d82-b719-bf25fa43db67-kube-api-access-xn55k\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.644892 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kk54\" (UniqueName: \"kubernetes.io/projected/c6d1df48-b16a-4691-82bf-68d8cce94a42-kube-api-access-6kk54\") pod \"node-ca-2xwxb\" (UID: \"c6d1df48-b16a-4691-82bf-68d8cce94a42\") " pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.649061 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvzk5\" (UniqueName: \"kubernetes.io/projected/97e52c03-2ca5-4cad-8459-f03029234544-kube-api-access-fvzk5\") pod \"multus-98n2n\" (UID: \"97e52c03-2ca5-4cad-8459-f03029234544\") " pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.649620 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.663246 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bnpt\" (UniqueName: \"kubernetes.io/projected/00eacbd3-d921-414b-8b8d-c4298bdd5a28-kube-api-access-5bnpt\") pod \"machine-config-daemon-v9vbm\" (UID: \"00eacbd3-d921-414b-8b8d-c4298bdd5a28\") " pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.691610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.691872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.691972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.692069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.692157 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.720207 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.729857 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.735999 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: source /etc/kubernetes/apiserver-url.env Mar 20 00:07:48 crc kubenswrapper[4867]: else Mar 20 00:07:48 crc kubenswrapper[4867]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 00:07:48 crc kubenswrapper[4867]: exit 1 Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 00:07:48 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.737594 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.739041 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6256d05ac6a8893c2d24c92fe4971857175f819f0162adffa575555df60dcb6f WatchSource:0}: Error finding container 6256d05ac6a8893c2d24c92fe4971857175f819f0162adffa575555df60dcb6f: Status 404 returned error can't find the container with id 6256d05ac6a8893c2d24c92fe4971857175f819f0162adffa575555df60dcb6f Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.741299 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:07:48 crc kubenswrapper[4867]: set +o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 00:07:48 crc kubenswrapper[4867]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 00:07:48 crc kubenswrapper[4867]: ho_enable="--enable-hybrid-overlay" Mar 20 00:07:48 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 00:07:48 crc kubenswrapper[4867]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 00:07:48 crc kubenswrapper[4867]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --webhook-host=127.0.0.1 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --webhook-port=9743 \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${ho_enable} \ Mar 20 00:07:48 crc kubenswrapper[4867]: --enable-interconnect \ Mar 20 00:07:48 crc kubenswrapper[4867]: --disable-approver \ Mar 20 00:07:48 crc kubenswrapper[4867]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --wait-for-kubernetes-api=200s \ Mar 20 00:07:48 crc kubenswrapper[4867]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --loglevel="${LOGLEVEL}" Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.743028 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2xwxb" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.744990 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:07:48 crc kubenswrapper[4867]: set +o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --disable-webhook \ Mar 20 00:07:48 crc kubenswrapper[4867]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --loglevel="${LOGLEVEL}" Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.746098 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.754740 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.756602 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6256d05ac6a8893c2d24c92fe4971857175f819f0162adffa575555df60dcb6f"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.757650 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6aa296535f18cfb644118a4ee75e4ab11b53c03f78f7c88fe4da27effb2b984c"} Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.760043 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: source /etc/kubernetes/apiserver-url.env Mar 20 00:07:48 crc kubenswrapper[4867]: else Mar 20 00:07:48 crc kubenswrapper[4867]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 00:07:48 crc kubenswrapper[4867]: exit 1 Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 00:07:48 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.761180 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.761523 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d1df48_b16a_4691_82bf_68d8cce94a42.slice/crio-90a038c6bcfc8de4d6f813603357b8f2ccc5693ee473c7b6e86c5f7982addd1c WatchSource:0}: Error finding container 90a038c6bcfc8de4d6f813603357b8f2ccc5693ee473c7b6e86c5f7982addd1c: Status 404 returned error can't find the container with id 90a038c6bcfc8de4d6f813603357b8f2ccc5693ee473c7b6e86c5f7982addd1c Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.761598 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:07:48 crc kubenswrapper[4867]: set +o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 00:07:48 crc kubenswrapper[4867]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 00:07:48 crc kubenswrapper[4867]: ho_enable="--enable-hybrid-overlay" Mar 20 00:07:48 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 00:07:48 crc kubenswrapper[4867]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 00:07:48 crc kubenswrapper[4867]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --webhook-host=127.0.0.1 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --webhook-port=9743 \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${ho_enable} \ Mar 20 00:07:48 crc kubenswrapper[4867]: --enable-interconnect \ Mar 20 00:07:48 crc kubenswrapper[4867]: --disable-approver \ Mar 20 00:07:48 crc kubenswrapper[4867]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --wait-for-kubernetes-api=200s \ Mar 20 00:07:48 crc kubenswrapper[4867]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --loglevel="${LOGLEVEL}" Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.763595 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.763686 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 00:07:48 crc kubenswrapper[4867]: while [ true ]; Mar 20 00:07:48 crc kubenswrapper[4867]: do Mar 20 00:07:48 crc kubenswrapper[4867]: for f in $(ls /tmp/serviceca); do Mar 20 00:07:48 crc kubenswrapper[4867]: echo $f Mar 20 00:07:48 crc kubenswrapper[4867]: ca_file_path="/tmp/serviceca/${f}" Mar 20 00:07:48 crc kubenswrapper[4867]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 00:07:48 crc kubenswrapper[4867]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 00:07:48 crc kubenswrapper[4867]: if [ -e "${reg_dir_path}" ]; then Mar 20 00:07:48 crc kubenswrapper[4867]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 00:07:48 crc kubenswrapper[4867]: else Mar 20 00:07:48 crc kubenswrapper[4867]: mkdir $reg_dir_path Mar 20 00:07:48 crc kubenswrapper[4867]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: for d in $(ls /etc/docker/certs.d); do Mar 20 00:07:48 crc kubenswrapper[4867]: echo $d Mar 20 00:07:48 crc kubenswrapper[4867]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 00:07:48 crc kubenswrapper[4867]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 00:07:48 crc kubenswrapper[4867]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 00:07:48 crc kubenswrapper[4867]: rm -rf /etc/docker/certs.d/$d Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: sleep 60 & wait ${!} Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-2xwxb_openshift-image-registry(c6d1df48-b16a-4691-82bf-68d8cce94a42): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.764363 4867 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:07:48 crc kubenswrapper[4867]: set +o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --disable-webhook \ Mar 20 00:07:48 crc kubenswrapper[4867]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --loglevel="${LOGLEVEL}" Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.765609 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-2xwxb" podUID="c6d1df48-b16a-4691-82bf-68d8cce94a42" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.766094 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.769003 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.773681 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-19458a0e639bdb18547728d01cc1f57e55e888a0c790e4b3160e139d43c8e766 WatchSource:0}: Error finding container 19458a0e639bdb18547728d01cc1f57e55e888a0c790e4b3160e139d43c8e766: Status 404 returned error can't find the container with id 19458a0e639bdb18547728d01cc1f57e55e888a0c790e4b3160e139d43c8e766 Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.774684 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc16597ca_4e52_469a_b6c2_cb2c0d0f07cf.slice/crio-47f7c7b8027968f6d93af3fcf396fe76bb1c839e5c7ef708ad112cff9bd37995 
WatchSource:0}: Error finding container 47f7c7b8027968f6d93af3fcf396fe76bb1c839e5c7ef708ad112cff9bd37995: Status 404 returned error can't find the container with id 47f7c7b8027968f6d93af3fcf396fe76bb1c839e5c7ef708ad112cff9bd37995 Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.775725 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.776357 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.776673 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzbmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dfc6c_openshift-multus(c16597ca-4e52-469a-b6c2-cb2c0d0f07cf): CreateContainerConfigError: 
services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.778012 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" podUID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.778161 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.779390 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.785195 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.790290 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d1c044_4ed7_44b6_9cd0_e52371e17e40.slice/crio-483703eeee435bff657fdbc9e9c166e3f89b82f742d13a97b1d14de911e7465e WatchSource:0}: Error finding container 483703eeee435bff657fdbc9e9c166e3f89b82f742d13a97b1d14de911e7465e: Status 404 returned error can't find the container with id 483703eeee435bff657fdbc9e9c166e3f89b82f742d13a97b1d14de911e7465e Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.793785 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 00:07:48 crc kubenswrapper[4867]: set -euo pipefail Mar 20 00:07:48 crc 
kubenswrapper[4867]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 00:07:48 crc kubenswrapper[4867]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 00:07:48 crc kubenswrapper[4867]: # As the secret mount is optional we must wait for the files to be present. Mar 20 00:07:48 crc kubenswrapper[4867]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 00:07:48 crc kubenswrapper[4867]: TS=$(date +%s) Mar 20 00:07:48 crc kubenswrapper[4867]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 00:07:48 crc kubenswrapper[4867]: HAS_LOGGED_INFO=0 Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: log_missing_certs(){ Mar 20 00:07:48 crc kubenswrapper[4867]: CUR_TS=$(date +%s) Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 00:07:48 crc kubenswrapper[4867]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 00:07:48 crc kubenswrapper[4867]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 00:07:48 crc kubenswrapper[4867]: HAS_LOGGED_INFO=1 Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: } Mar 20 00:07:48 crc kubenswrapper[4867]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 00:07:48 crc kubenswrapper[4867]: log_missing_certs Mar 20 00:07:48 crc kubenswrapper[4867]: sleep 5 Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/kube-rbac-proxy \ Mar 20 00:07:48 crc kubenswrapper[4867]: --logtostderr \ Mar 20 00:07:48 crc kubenswrapper[4867]: --secure-listen-address=:9108 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 00:07:48 crc kubenswrapper[4867]: --upstream=http://127.0.0.1:29108/ \ Mar 20 00:07:48 crc kubenswrapper[4867]: --tls-private-key-file=${TLS_PK} \ Mar 20 00:07:48 crc kubenswrapper[4867]: --tls-cert-file=${TLS_CERT} Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xl5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-tfl97_openshift-ovn-kubernetes(a4d1c044-4ed7-44b6-9cd0-e52371e17e40): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.794453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.794476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.794486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.794519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.794530 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.794661 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.795157 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zgbkt" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.795961 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:07:48 crc kubenswrapper[4867]: set +o allexport Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v4_join_subnet_opt= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v6_join_subnet_opt= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v4_transit_switch_subnet_opt= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: 
ovn_v6_transit_switch_subnet_opt= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: dns_name_resolver_enabled_flag= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "false" == "true" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: persistent_ips_enabled_flag= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "true" == "true" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: # This is needed so that converting clusters from GA to TP Mar 20 00:07:48 crc kubenswrapper[4867]: # will rollout control plane pods as well Mar 20 00:07:48 crc kubenswrapper[4867]: network_segmentation_enabled_flag= Mar 20 00:07:48 crc kubenswrapper[4867]: multi_network_enabled_flag= Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "true" == "true" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: multi_network_enabled_flag="--enable-multi-network" Mar 20 00:07:48 crc kubenswrapper[4867]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 00:07:48 crc kubenswrapper[4867]: exec /usr/bin/ovnkube \ Mar 20 00:07:48 crc kubenswrapper[4867]: --enable-interconnect \ Mar 20 00:07:48 crc kubenswrapper[4867]: --init-cluster-manager 
"${K8S_NODE}" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 00:07:48 crc kubenswrapper[4867]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 00:07:48 crc kubenswrapper[4867]: --metrics-enable-pprof \ Mar 20 00:07:48 crc kubenswrapper[4867]: --metrics-enable-config-duration \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${ovn_v4_join_subnet_opt} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${ovn_v6_join_subnet_opt} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${dns_name_resolver_enabled_flag} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${persistent_ips_enabled_flag} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${multi_network_enabled_flag} \ Mar 20 00:07:48 crc kubenswrapper[4867]: ${network_segmentation_enabled_flag} Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xl5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-tfl97_openshift-ovn-kubernetes(a4d1c044-4ed7-44b6-9cd0-e52371e17e40): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.797203 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" podUID="a4d1c044-4ed7-44b6-9cd0-e52371e17e40" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.807037 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.809961 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c4bd4e_640d_48b8_8e73_3aead59105b9.slice/crio-cafd042cbee5fbd72ed261b69c379a7b44ef7b8fbba467d3a499d833bfc7ab27 WatchSource:0}: Error finding container cafd042cbee5fbd72ed261b69c379a7b44ef7b8fbba467d3a499d833bfc7ab27: Status 404 returned error can't find the container with id cafd042cbee5fbd72ed261b69c379a7b44ef7b8fbba467d3a499d833bfc7ab27 Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.813253 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 00:07:48 crc kubenswrapper[4867]: set -uo pipefail Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 00:07:48 crc kubenswrapper[4867]: HOSTS_FILE="/etc/hosts" Mar 20 00:07:48 crc kubenswrapper[4867]: TEMP_FILE="/etc/hosts.tmp" Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: # Make a temporary file with the old hosts file's attributes. Mar 20 00:07:48 crc kubenswrapper[4867]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 00:07:48 crc kubenswrapper[4867]: echo "Failed to preserve hosts file. Exiting." 
Mar 20 00:07:48 crc kubenswrapper[4867]: exit 1 Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: while true; do Mar 20 00:07:48 crc kubenswrapper[4867]: declare -A svc_ips Mar 20 00:07:48 crc kubenswrapper[4867]: for svc in "${services[@]}"; do Mar 20 00:07:48 crc kubenswrapper[4867]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 00:07:48 crc kubenswrapper[4867]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 00:07:48 crc kubenswrapper[4867]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 00:07:48 crc kubenswrapper[4867]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 00:07:48 crc kubenswrapper[4867]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:07:48 crc kubenswrapper[4867]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:07:48 crc kubenswrapper[4867]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:07:48 crc kubenswrapper[4867]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 00:07:48 crc kubenswrapper[4867]: for i in ${!cmds[*]} Mar 20 00:07:48 crc kubenswrapper[4867]: do Mar 20 00:07:48 crc kubenswrapper[4867]: ips=($(eval "${cmds[i]}")) Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: svc_ips["${svc}"]="${ips[@]}" Mar 20 00:07:48 crc kubenswrapper[4867]: break Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: # Update /etc/hosts only if we get valid service IPs Mar 20 00:07:48 crc kubenswrapper[4867]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 00:07:48 crc kubenswrapper[4867]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 00:07:48 crc kubenswrapper[4867]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 00:07:48 crc kubenswrapper[4867]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 00:07:48 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:07:48 crc kubenswrapper[4867]: continue Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: # Append resolver entries for services Mar 20 00:07:48 crc kubenswrapper[4867]: rc=0 Mar 20 00:07:48 crc kubenswrapper[4867]: for svc in "${!svc_ips[@]}"; do Mar 20 00:07:48 crc kubenswrapper[4867]: for ip in ${svc_ips[${svc}]}; do Mar 20 00:07:48 crc kubenswrapper[4867]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: if [[ $rc -ne 0 ]]; then Mar 20 00:07:48 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:07:48 crc kubenswrapper[4867]: continue Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: Mar 20 00:07:48 crc kubenswrapper[4867]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 00:07:48 crc kubenswrapper[4867]: # Replace /etc/hosts with our modified version if needed Mar 20 00:07:48 crc kubenswrapper[4867]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 00:07:48 crc kubenswrapper[4867]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 00:07:48 crc kubenswrapper[4867]: fi Mar 20 00:07:48 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:07:48 crc kubenswrapper[4867]: unset svc_ips Mar 20 00:07:48 crc kubenswrapper[4867]: done Mar 20 00:07:48 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ff4ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zgbkt_openshift-dns(02c4bd4e-640d-48b8-8e73-3aead59105b9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.814518 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zgbkt" podUID="02c4bd4e-640d-48b8-8e73-3aead59105b9" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.815715 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.830535 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3
ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.837269 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-98n2n" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.839453 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.847880 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.849428 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 00:07:48 crc kubenswrapper[4867]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 00:07:48 crc 
kubenswrapper[4867]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s
.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvzk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,St
dinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-98n2n_openshift-multus(97e52c03-2ca5-4cad-8459-f03029234544): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.850606 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-98n2n" podUID="97e52c03-2ca5-4cad-8459-f03029234544" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.857596 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.859163 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.863401 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.864942 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:07:48 crc kubenswrapper[4867]: W0320 00:07:48.873333 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00eacbd3_d921_414b_8b8d_c4298bdd5a28.slice/crio-3d5fb3c17575f9b0f45caaa7a374b56dcf904be1501634756326a2c11e334e2d WatchSource:0}: Error finding container 3d5fb3c17575f9b0f45caaa7a374b56dcf904be1501634756326a2c11e334e2d: Status 404 returned error can't find the container with id 3d5fb3c17575f9b0f45caaa7a374b56dcf904be1501634756326a2c11e334e2d Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.875430 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bnpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.878115 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bnpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.879319 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" 
podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.879266 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.879963 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:48 crc kubenswrapper[4867]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 00:07:48 crc kubenswrapper[4867]: apiVersion: v1 Mar 20 00:07:48 crc kubenswrapper[4867]: clusters: Mar 20 00:07:48 crc kubenswrapper[4867]: - cluster: Mar 20 00:07:48 crc kubenswrapper[4867]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 00:07:48 crc kubenswrapper[4867]: server: https://api-int.crc.testing:6443 Mar 20 00:07:48 crc kubenswrapper[4867]: name: default-cluster Mar 20 00:07:48 crc kubenswrapper[4867]: contexts: Mar 20 00:07:48 crc kubenswrapper[4867]: - context: Mar 20 00:07:48 crc kubenswrapper[4867]: cluster: default-cluster Mar 20 00:07:48 crc kubenswrapper[4867]: namespace: default Mar 20 00:07:48 crc kubenswrapper[4867]: user: default-auth Mar 20 00:07:48 crc kubenswrapper[4867]: name: default-context Mar 20 00:07:48 crc kubenswrapper[4867]: current-context: default-context Mar 20 00:07:48 crc kubenswrapper[4867]: kind: Config Mar 20 00:07:48 crc kubenswrapper[4867]: preferences: {} Mar 20 00:07:48 crc kubenswrapper[4867]: users: Mar 20 00:07:48 crc kubenswrapper[4867]: - name: default-auth Mar 20 00:07:48 crc kubenswrapper[4867]: user: Mar 20 00:07:48 crc kubenswrapper[4867]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 00:07:48 crc kubenswrapper[4867]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 00:07:48 crc kubenswrapper[4867]: EOF Mar 20 00:07:48 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68jdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:48 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:48 crc kubenswrapper[4867]: E0320 00:07:48.881629 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.888491 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.896289 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.897235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.897273 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.897283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.897295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.897304 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.905709 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.913562 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.921483 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.928459 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.939635 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.948084 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.971528 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.981641 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.994074 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.998957 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.998996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.999008 4867 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.999024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:48 crc kubenswrapper[4867]: I0320 00:07:48.999037 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:48Z","lastTransitionTime":"2026-03-20T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.019584 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e2
7f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.028059 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.028164 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:07:50.028142626 +0000 UTC m=+84.254680153 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.028208 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.028234 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.028254 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.028364 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.028399 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:50.028389952 +0000 UTC m=+84.254927469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.028406 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.028440 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:50.028430923 +0000 UTC m=+84.254968440 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.029151 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.029190 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.029204 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.029240 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:50.029231843 +0000 UTC m=+84.255769360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.032369 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.056187 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.100080 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.101363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.101423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.101444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.101468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.101484 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.129790 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.129916 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.130075 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.130119 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.130119 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.130134 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.130209 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:50.130184646 +0000 UTC m=+84.356722193 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.130249 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:50.130225487 +0000 UTC m=+84.356763124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.138633 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.178887 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.204115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.204167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.204184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.204206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.204224 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.218040 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.306923 4867 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.306968 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.306984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.307007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.307024 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.410391 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.410459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.410476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.410534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.410553 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.513902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.514281 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.514299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.514322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.514338 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.616885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.616960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.617040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.617059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.617070 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.719151 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.719209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.719225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.719249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.719268 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.761721 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"3d5fb3c17575f9b0f45caaa7a374b56dcf904be1501634756326a2c11e334e2d"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.763008 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"a649631439ad5c106e63692b492367a8607628e5e3ef5401365566b3dc2961a5"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.764649 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zgbkt" event={"ID":"02c4bd4e-640d-48b8-8e73-3aead59105b9","Type":"ContainerStarted","Data":"cafd042cbee5fbd72ed261b69c379a7b44ef7b8fbba467d3a499d833bfc7ab27"} Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.764898 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:49 crc kubenswrapper[4867]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 00:07:49 crc kubenswrapper[4867]: apiVersion: v1 Mar 20 00:07:49 crc kubenswrapper[4867]: clusters: Mar 20 00:07:49 crc kubenswrapper[4867]: - cluster: Mar 20 00:07:49 crc kubenswrapper[4867]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 00:07:49 crc kubenswrapper[4867]: server: https://api-int.crc.testing:6443 Mar 20 00:07:49 crc kubenswrapper[4867]: name: default-cluster Mar 20 00:07:49 crc kubenswrapper[4867]: contexts: Mar 20 00:07:49 crc kubenswrapper[4867]: - context: Mar 20 00:07:49 crc kubenswrapper[4867]: cluster: default-cluster Mar 20 00:07:49 
crc kubenswrapper[4867]: namespace: default Mar 20 00:07:49 crc kubenswrapper[4867]: user: default-auth Mar 20 00:07:49 crc kubenswrapper[4867]: name: default-context Mar 20 00:07:49 crc kubenswrapper[4867]: current-context: default-context Mar 20 00:07:49 crc kubenswrapper[4867]: kind: Config Mar 20 00:07:49 crc kubenswrapper[4867]: preferences: {} Mar 20 00:07:49 crc kubenswrapper[4867]: users: Mar 20 00:07:49 crc kubenswrapper[4867]: - name: default-auth Mar 20 00:07:49 crc kubenswrapper[4867]: user: Mar 20 00:07:49 crc kubenswrapper[4867]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 00:07:49 crc kubenswrapper[4867]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 00:07:49 crc kubenswrapper[4867]: EOF Mar 20 00:07:49 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68jdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:49 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: 
E0320 00:07:49.766006 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:49 crc kubenswrapper[4867]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 00:07:49 crc kubenswrapper[4867]: set -uo pipefail Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 00:07:49 crc kubenswrapper[4867]: HOSTS_FILE="/etc/hosts" Mar 20 00:07:49 crc kubenswrapper[4867]: TEMP_FILE="/etc/hosts.tmp" Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: # Make a temporary file with the old hosts file's attributes. Mar 20 00:07:49 crc kubenswrapper[4867]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 00:07:49 crc kubenswrapper[4867]: echo "Failed to preserve hosts file. Exiting." Mar 20 00:07:49 crc kubenswrapper[4867]: exit 1 Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: while true; do Mar 20 00:07:49 crc kubenswrapper[4867]: declare -A svc_ips Mar 20 00:07:49 crc kubenswrapper[4867]: for svc in "${services[@]}"; do Mar 20 00:07:49 crc kubenswrapper[4867]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 00:07:49 crc kubenswrapper[4867]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 20 00:07:49 crc kubenswrapper[4867]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 00:07:49 crc kubenswrapper[4867]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 00:07:49 crc kubenswrapper[4867]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:07:49 crc kubenswrapper[4867]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:07:49 crc kubenswrapper[4867]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:07:49 crc kubenswrapper[4867]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 00:07:49 crc kubenswrapper[4867]: for i in ${!cmds[*]} Mar 20 00:07:49 crc kubenswrapper[4867]: do Mar 20 00:07:49 crc kubenswrapper[4867]: ips=($(eval "${cmds[i]}")) Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: svc_ips["${svc}"]="${ips[@]}" Mar 20 00:07:49 crc kubenswrapper[4867]: break Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: # Update /etc/hosts only if we get valid service IPs Mar 20 00:07:49 crc kubenswrapper[4867]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 00:07:49 crc kubenswrapper[4867]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 00:07:49 crc kubenswrapper[4867]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 00:07:49 crc kubenswrapper[4867]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 00:07:49 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:07:49 crc kubenswrapper[4867]: continue Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: # Append resolver entries for services Mar 20 00:07:49 crc kubenswrapper[4867]: rc=0 Mar 20 00:07:49 crc kubenswrapper[4867]: for svc in "${!svc_ips[@]}"; do Mar 20 00:07:49 crc kubenswrapper[4867]: for ip in ${svc_ips[${svc}]}; do Mar 20 00:07:49 crc kubenswrapper[4867]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ $rc -ne 0 ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:07:49 crc kubenswrapper[4867]: continue Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 00:07:49 crc kubenswrapper[4867]: # Replace /etc/hosts with our modified version if needed Mar 20 00:07:49 crc kubenswrapper[4867]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 00:07:49 crc kubenswrapper[4867]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:07:49 crc kubenswrapper[4867]: unset svc_ips Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ff4ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zgbkt_openshift-dns(02c4bd4e-640d-48b8-8e73-3aead59105b9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:49 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.766109 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bnpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.766118 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.767113 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerStarted","Data":"47f7c7b8027968f6d93af3fcf396fe76bb1c839e5c7ef708ad112cff9bd37995"} Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.767280 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zgbkt" podUID="02c4bd4e-640d-48b8-8e73-3aead59105b9" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.769259 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"19458a0e639bdb18547728d01cc1f57e55e888a0c790e4b3160e139d43c8e766"} Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.769244 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bnpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.769625 4867 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzbmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dfc6c_openshift-multus(c16597ca-4e52-469a-b6c2-cb2c0d0f07cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.770592 4867 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.770697 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98n2n" event={"ID":"97e52c03-2ca5-4cad-8459-f03029234544","Type":"ContainerStarted","Data":"86740f663e58f437e63b5add22f6d8b5e2100abad42b3b939410b64fe89621c9"} Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.770747 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" podUID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.771889 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: 
{{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.772239 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:49 crc kubenswrapper[4867]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 00:07:49 crc kubenswrapper[4867]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 00:07:49 crc kubenswrapper[4867]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvzk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-98n2n_openshift-multus(97e52c03-2ca5-4cad-8459-f03029234544): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:49 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.772551 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" event={"ID":"a4d1c044-4ed7-44b6-9cd0-e52371e17e40","Type":"ContainerStarted","Data":"483703eeee435bff657fdbc9e9c166e3f89b82f742d13a97b1d14de911e7465e"} Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.773174 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.773427 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-98n2n" podUID="97e52c03-2ca5-4cad-8459-f03029234544" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.773939 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2xwxb" event={"ID":"c6d1df48-b16a-4691-82bf-68d8cce94a42","Type":"ContainerStarted","Data":"90a038c6bcfc8de4d6f813603357b8f2ccc5693ee473c7b6e86c5f7982addd1c"} Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.775005 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:49 crc 
kubenswrapper[4867]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 00:07:49 crc kubenswrapper[4867]: set -euo pipefail Mar 20 00:07:49 crc kubenswrapper[4867]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 00:07:49 crc kubenswrapper[4867]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 00:07:49 crc kubenswrapper[4867]: # As the secret mount is optional we must wait for the files to be present. Mar 20 00:07:49 crc kubenswrapper[4867]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 00:07:49 crc kubenswrapper[4867]: TS=$(date +%s) Mar 20 00:07:49 crc kubenswrapper[4867]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 00:07:49 crc kubenswrapper[4867]: HAS_LOGGED_INFO=0 Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: log_missing_certs(){ Mar 20 00:07:49 crc kubenswrapper[4867]: CUR_TS=$(date +%s) Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 00:07:49 crc kubenswrapper[4867]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 00:07:49 crc kubenswrapper[4867]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 00:07:49 crc kubenswrapper[4867]: HAS_LOGGED_INFO=1 Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: } Mar 20 00:07:49 crc kubenswrapper[4867]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 00:07:49 crc kubenswrapper[4867]: log_missing_certs Mar 20 00:07:49 crc kubenswrapper[4867]: sleep 5 Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 00:07:49 crc kubenswrapper[4867]: exec /usr/bin/kube-rbac-proxy \ Mar 20 00:07:49 crc kubenswrapper[4867]: --logtostderr \ Mar 20 00:07:49 crc kubenswrapper[4867]: --secure-listen-address=:9108 \ Mar 20 00:07:49 crc kubenswrapper[4867]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 00:07:49 crc kubenswrapper[4867]: --upstream=http://127.0.0.1:29108/ \ Mar 20 00:07:49 crc kubenswrapper[4867]: --tls-private-key-file=${TLS_PK} \ Mar 20 00:07:49 crc kubenswrapper[4867]: --tls-cert-file=${TLS_CERT} Mar 20 00:07:49 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xl5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-tfl97_openshift-ovn-kubernetes(a4d1c044-4ed7-44b6-9cd0-e52371e17e40): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:49 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.775072 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:49 crc kubenswrapper[4867]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 00:07:49 crc kubenswrapper[4867]: while [ true ]; Mar 20 00:07:49 crc kubenswrapper[4867]: do Mar 20 00:07:49 crc kubenswrapper[4867]: for f in $(ls /tmp/serviceca); do Mar 20 00:07:49 crc kubenswrapper[4867]: echo $f Mar 20 00:07:49 crc kubenswrapper[4867]: ca_file_path="/tmp/serviceca/${f}" Mar 20 00:07:49 crc kubenswrapper[4867]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 00:07:49 crc kubenswrapper[4867]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 00:07:49 crc kubenswrapper[4867]: if [ -e 
"${reg_dir_path}" ]; then Mar 20 00:07:49 crc kubenswrapper[4867]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 00:07:49 crc kubenswrapper[4867]: else Mar 20 00:07:49 crc kubenswrapper[4867]: mkdir $reg_dir_path Mar 20 00:07:49 crc kubenswrapper[4867]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: for d in $(ls /etc/docker/certs.d); do Mar 20 00:07:49 crc kubenswrapper[4867]: echo $d Mar 20 00:07:49 crc kubenswrapper[4867]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 00:07:49 crc kubenswrapper[4867]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 00:07:49 crc kubenswrapper[4867]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 00:07:49 crc kubenswrapper[4867]: rm -rf /etc/docker/certs.d/$d Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: sleep 60 & wait ${!} Mar 20 00:07:49 crc kubenswrapper[4867]: done Mar 20 00:07:49 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-2xwxb_openshift-image-registry(c6d1df48-b16a-4691-82bf-68d8cce94a42): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:49 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.776341 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-2xwxb" podUID="c6d1df48-b16a-4691-82bf-68d8cce94a42" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.777322 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:07:49 crc kubenswrapper[4867]: container 
&Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: set -o allexport Mar 20 00:07:49 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:07:49 crc kubenswrapper[4867]: set +o allexport Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v4_join_subnet_opt= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v6_join_subnet_opt= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v4_transit_switch_subnet_opt= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v6_transit_switch_subnet_opt= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: dns_name_resolver_enabled_flag= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "false" == "true" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: 
dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: persistent_ips_enabled_flag= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "true" == "true" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: # This is needed so that converting clusters from GA to TP Mar 20 00:07:49 crc kubenswrapper[4867]: # will rollout control plane pods as well Mar 20 00:07:49 crc kubenswrapper[4867]: network_segmentation_enabled_flag= Mar 20 00:07:49 crc kubenswrapper[4867]: multi_network_enabled_flag= Mar 20 00:07:49 crc kubenswrapper[4867]: if [[ "true" == "true" ]]; then Mar 20 00:07:49 crc kubenswrapper[4867]: multi_network_enabled_flag="--enable-multi-network" Mar 20 00:07:49 crc kubenswrapper[4867]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 00:07:49 crc kubenswrapper[4867]: fi Mar 20 00:07:49 crc kubenswrapper[4867]: Mar 20 00:07:49 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 00:07:49 crc kubenswrapper[4867]: exec /usr/bin/ovnkube \ Mar 20 00:07:49 crc kubenswrapper[4867]: --enable-interconnect \ Mar 20 00:07:49 crc kubenswrapper[4867]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 00:07:49 crc kubenswrapper[4867]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 00:07:49 crc kubenswrapper[4867]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 00:07:49 crc kubenswrapper[4867]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 00:07:49 crc kubenswrapper[4867]: --metrics-enable-pprof \ Mar 20 00:07:49 crc kubenswrapper[4867]: --metrics-enable-config-duration \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${ovn_v4_join_subnet_opt} \ Mar 20 00:07:49 crc 
kubenswrapper[4867]: ${ovn_v6_join_subnet_opt} \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${dns_name_resolver_enabled_flag} \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${persistent_ips_enabled_flag} \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${multi_network_enabled_flag} \ Mar 20 00:07:49 crc kubenswrapper[4867]: ${network_segmentation_enabled_flag} Mar 20 00:07:49 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xl5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-tfl97_openshift-ovn-kubernetes(a4d1c044-4ed7-44b6-9cd0-e52371e17e40): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:07:49 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:07:49 crc kubenswrapper[4867]: E0320 00:07:49.778549 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" podUID="a4d1c044-4ed7-44b6-9cd0-e52371e17e40" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.795808 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.809116 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.821658 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.821714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.821724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.821941 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.821952 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.824036 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.832573 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.842664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.851032 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.861176 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.875901 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.890792 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.900859 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.915409 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.924548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.924606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.924624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.924652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.924673 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:49Z","lastTransitionTime":"2026-03-20T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.927087 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.947975 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.967198 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:49 crc kubenswrapper[4867]: I0320 00:07:49.988587 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.003221 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.014057 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.026765 4867 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.026820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.026841 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.026866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.026884 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.042715 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.042905 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:07:52.042865575 +0000 UTC m=+86.269403122 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.042975 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name
\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa1
96a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043306 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043341 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.043130 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043366 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.043449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043461 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:52.0434373 +0000 UTC m=+86.269974897 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.043580 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043697 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043774 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:52.043752477 +0000 UTC m=+86.270290034 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043784 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.043901 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:52.0438675 +0000 UTC m=+86.270405107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.057531 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.066571 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.076529 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.097006 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.129120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.129157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.129168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.129184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.129195 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.138443 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.144875 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.144918 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.145045 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.145063 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.145076 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.145127 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:52.145111942 +0000 UTC m=+86.371649469 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.145149 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.145243 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. 
No retries permitted until 2026-03-20 00:07:52.145214984 +0000 UTC m=+86.371752531 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.183791 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.218719 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.233328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.233398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.233415 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.233440 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.233457 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.264294 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.299129 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.336005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.336044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.336060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.336083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.336099 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.350719 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.386083 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.421374 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.421603 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.421611 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.421383 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.421591 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.421767 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.421921 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.422915 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:50 crc kubenswrapper[4867]: E0320 00:07:50.423055 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.429792 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.430910 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.433157 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.434482 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.436466 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.437841 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.438464 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.438558 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 
00:07:50.438576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.438601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.438618 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.439307 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.441333 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.442713 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.444476 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.445265 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.447369 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.448210 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.449596 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.450948 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.451816 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.453155 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.453977 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.456163 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.457998 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.458806 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.460294 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.461032 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.462561 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.463287 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.464286 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.465957 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.466769 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.468290 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.469369 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.470696 4867 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.470912 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.473266 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.474715 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.476069 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.479587 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.481030 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.483212 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.484761 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.487003 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.487978 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.489823 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.491024 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.492573 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.493435 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.494927 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.495864 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.497697 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.498424 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.499716 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.500435 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.501881 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.502826 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.503593 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.540743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.540961 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.541053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.541143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.541242 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.644533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.644848 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.644934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.645015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.645104 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.747655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.747711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.747728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.747751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.747767 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.849865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.849909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.849925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.849946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.849962 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.952143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.952216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.952239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.952267 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:50 crc kubenswrapper[4867]: I0320 00:07:50.952298 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:50Z","lastTransitionTime":"2026-03-20T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.056020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.056085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.056102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.056124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.056142 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.159002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.159074 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.159097 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.159136 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.159162 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.261910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.261960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.261978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.262003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.262020 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.366760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.366830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.366847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.366871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.366888 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.474527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.474603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.474633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.474661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.474681 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.577962 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.578026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.578045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.578072 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.578090 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.680705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.680770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.680787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.680811 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.680828 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.783476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.783553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.783571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.783596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.783613 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.887073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.887131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.887149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.887171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.887188 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.989986 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.990058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.990083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.990113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:51 crc kubenswrapper[4867]: I0320 00:07:51.990135 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:51Z","lastTransitionTime":"2026-03-20T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.066156 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.066303 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.066387 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.066429 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.066614 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:52 
crc kubenswrapper[4867]: E0320 00:07:52.066702 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:56.066678387 +0000 UTC m=+90.293215934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.066718 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.066809 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:07:56.066770179 +0000 UTC m=+90.293307736 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.066861 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:56.066845891 +0000 UTC m=+90.293383448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.066984 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.067010 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.067029 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 
00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.067095 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:56.067070066 +0000 UTC m=+90.293607623 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.092756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.092841 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.092865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.092898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.092920 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.167003 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.167134 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.167294 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.167342 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.167379 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:56.167354254 +0000 UTC m=+90.393891801 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.167390 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.167410 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.167538 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:07:56.167476397 +0000 UTC m=+90.394014014 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.195513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.195559 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.195571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.195588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.195601 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.298237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.298298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.298316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.298342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.298361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.401978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.402031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.402049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.402073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.402095 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.421624 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.421671 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.421642 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.421789 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.421895 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.422105 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.422537 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.422888 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.436972 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.437304 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.437666 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.505262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.505310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.505327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.505355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.505373 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.608280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.608329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.608346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.608367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.608384 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.711792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.712031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.712048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.712070 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.712089 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.782195 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:07:52 crc kubenswrapper[4867]: E0320 00:07:52.782328 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.814187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.814222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.814231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.814242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.814254 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.915824 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.915858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.915868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.915880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:52 crc kubenswrapper[4867]: I0320 00:07:52.915889 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:52Z","lastTransitionTime":"2026-03-20T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.018146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.018200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.018212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.018227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.018238 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.120809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.120872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.120889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.120912 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.120929 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.187236 4867 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.223465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.223567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.223593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.223625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.223648 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.327153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.327403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.327522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.327681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.327909 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.433347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.433432 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.433456 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.433485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.433543 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.536829 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.536891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.536909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.536936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.536954 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.640256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.640324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.640342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.640373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.640391 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.743915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.743979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.743997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.744023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.744042 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.847191 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.847263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.847291 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.847320 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.847341 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.950681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.950740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.950757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.950781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:53 crc kubenswrapper[4867]: I0320 00:07:53.950801 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:53Z","lastTransitionTime":"2026-03-20T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.053850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.053916 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.053938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.053966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.053985 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.156984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.157042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.157058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.157081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.157097 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.259610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.259674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.259695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.259721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.259740 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.363097 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.363146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.363164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.363190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.363207 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.421083 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.421134 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.421211 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:54 crc kubenswrapper[4867]: E0320 00:07:54.421300 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.421336 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:54 crc kubenswrapper[4867]: E0320 00:07:54.421579 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:07:54 crc kubenswrapper[4867]: E0320 00:07:54.422544 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:07:54 crc kubenswrapper[4867]: E0320 00:07:54.422793 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.466640 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.466713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.466736 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.466768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.466792 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.569783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.569840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.569856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.569878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.569895 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.673153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.673211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.673226 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.673249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.673268 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.776552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.776630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.776653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.776683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.776705 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.880223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.880297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.880315 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.880341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.880359 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.983872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.983929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.983961 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.984003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:54 crc kubenswrapper[4867]: I0320 00:07:54.984025 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:54Z","lastTransitionTime":"2026-03-20T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.087294 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.087352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.087368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.087388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.087401 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.190246 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.190318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.190340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.190363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.190380 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.293230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.293291 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.293311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.293356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.293384 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.396776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.396851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.396877 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.396907 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.396931 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.500371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.500427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.500444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.500468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.500487 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.603775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.603833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.603850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.603872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.603889 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.706803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.706887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.706916 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.706947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.706968 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.809389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.809447 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.809464 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.809526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.809544 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.912262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.912318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.912329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.912349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:55 crc kubenswrapper[4867]: I0320 00:07:55.912361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:55Z","lastTransitionTime":"2026-03-20T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.015082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.015156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.015199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.015233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.015256 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.051071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.051142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.051168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.051195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.051230 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.067003 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.071849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.071925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.071948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.071972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.071989 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.083697 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.088124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.088175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.088238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.088301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.088349 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.103106 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.107801 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.107921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.107942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.107970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.107989 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.113338 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.113481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.113683 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:08:04.113646775 +0000 UTC m=+98.340184342 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.113695 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.113783 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:04.113763818 +0000 UTC m=+98.340301365 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.113866 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.113960 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.114084 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.114123 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.114147 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.114224 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:04.114197488 +0000 UTC m=+98.340735095 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.114352 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.114425 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:04.114402084 +0000 UTC m=+98.340939631 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.124409 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redha
t-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc
4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\
"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":4488870
27}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.129629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.129700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.129722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.129750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.129773 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.145671 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.145889 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.148572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.155673 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.155723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.155754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.155774 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.215386 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.215462 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.215675 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.215701 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.215722 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.215757 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.215798 
4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:04.215778698 +0000 UTC m=+98.442316245 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.215864 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:04.21583494 +0000 UTC m=+98.442372487 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.258624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.258680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.258697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.258719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.258740 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.362081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.362188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.362209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.362276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.362299 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.423681 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.423692 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.423777 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.425204 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.425534 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.425937 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.426103 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:07:56 crc kubenswrapper[4867]: E0320 00:07:56.426349 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.439185 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.455138 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.468660 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.468696 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.468707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.468723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.468735 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.472021 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.484262 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.499671 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.512201 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.532934 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e2
7f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.548951 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.559386 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.569586 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.572467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.572580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 
00:07:56.572609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.572643 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.572666 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.581608 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.599552 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.612032 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.627996 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.641661 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.650804 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.675243 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.675281 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.675298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.675324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.675341 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.778439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.778540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.778564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.778594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.778616 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.881508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.881831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.881913 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.881997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.882092 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.984884 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.984922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.984935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.984951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:56 crc kubenswrapper[4867]: I0320 00:07:56.984963 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:56Z","lastTransitionTime":"2026-03-20T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.086746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.086802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.086819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.086842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.086859 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.189965 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.190046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.190071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.190110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.190129 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.293677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.293750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.293768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.293792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.293810 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.396741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.396805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.396823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.396847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.396867 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.500425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.500581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.500602 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.500626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.500642 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.602865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.602937 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.602951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.602969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.602981 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.705531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.705618 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.705642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.705664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.705715 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.808140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.808205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.808222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.808244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.808261 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.915192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.915226 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.915235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.915249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:57 crc kubenswrapper[4867]: I0320 00:07:57.915276 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:57Z","lastTransitionTime":"2026-03-20T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.018252 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.018302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.018321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.018346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.018363 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.121357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.121416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.121433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.121461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.121479 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.223823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.223869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.223927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.223954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.223969 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.326796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.326889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.326910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.326932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.326950 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.421121 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.421140 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.421264 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:07:58 crc kubenswrapper[4867]: E0320 00:07:58.421447 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.421535 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:07:58 crc kubenswrapper[4867]: E0320 00:07:58.421600 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:07:58 crc kubenswrapper[4867]: E0320 00:07:58.421822 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:07:58 crc kubenswrapper[4867]: E0320 00:07:58.422281 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.429441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.429484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.429510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.429560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.429576 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.532810 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.532880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.532900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.532931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.532954 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.636271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.636329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.636347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.636371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.636391 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.739306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.739366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.739383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.739409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.739428 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.843116 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.843152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.843164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.843180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.843194 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.945276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.945327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.945348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.945375 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:58 crc kubenswrapper[4867]: I0320 00:07:58.945394 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:58Z","lastTransitionTime":"2026-03-20T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.047910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.047975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.047994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.048018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.048036 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.150251 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.150302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.150317 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.150338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.150354 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.252781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.252812 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.252820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.252833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.252841 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.354944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.354975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.354983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.354996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.355005 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.457895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.457944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.457958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.457975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.457988 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.560589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.560623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.560636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.560649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.560659 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.663098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.663138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.663147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.663160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.663171 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.766046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.766096 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.766113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.766134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.766151 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.868519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.868555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.868566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.868580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.868591 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.970692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.970745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.970761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.970781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:07:59 crc kubenswrapper[4867]: I0320 00:07:59.970800 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:07:59Z","lastTransitionTime":"2026-03-20T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.073797 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.073849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.073866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.073888 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.073908 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.176966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.177036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.177057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.177088 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.177109 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.279651 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.279694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.279703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.279717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.279726 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.382104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.382137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.382145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.382157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.382166 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.420986 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.421017 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.421021 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:00 crc kubenswrapper[4867]: E0320 00:08:00.421164 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.421213 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:00 crc kubenswrapper[4867]: E0320 00:08:00.421296 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:00 crc kubenswrapper[4867]: E0320 00:08:00.421477 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:00 crc kubenswrapper[4867]: E0320 00:08:00.421620 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.484198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.484683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.484814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.484906 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.484990 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.588253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.588308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.588324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.588345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.588361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.690780 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.691098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.691453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.691718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.691955 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.706975 4867 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.794373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.794439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.794457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.794482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.794555 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.898517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.898557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.898567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.898582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:00 crc kubenswrapper[4867]: I0320 00:08:00.898592 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:00Z","lastTransitionTime":"2026-03-20T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.000792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.000827 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.000835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.000848 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.000857 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.104071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.104111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.104123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.104142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.104175 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.207212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.207249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.207257 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.207271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.207280 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.310256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.310306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.310323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.310347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.310365 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.412650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.412693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.412703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.412718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.412730 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.423868 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzbmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dfc6c_openshift-multus(c16597ca-4e52-469a-b6c2-cb2c0d0f07cf): CreateContainerConfigError: services have not yet been read at least once, cannot 
construct envvars" logger="UnhandledError" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.423945 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:01 crc kubenswrapper[4867]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 00:08:01 crc kubenswrapper[4867]: apiVersion: v1 Mar 20 00:08:01 crc kubenswrapper[4867]: clusters: Mar 20 00:08:01 crc kubenswrapper[4867]: - cluster: Mar 20 00:08:01 crc kubenswrapper[4867]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 00:08:01 crc kubenswrapper[4867]: server: https://api-int.crc.testing:6443 Mar 20 00:08:01 crc kubenswrapper[4867]: name: default-cluster Mar 20 00:08:01 crc kubenswrapper[4867]: contexts: Mar 20 00:08:01 crc kubenswrapper[4867]: - context: Mar 20 00:08:01 crc kubenswrapper[4867]: cluster: default-cluster Mar 20 00:08:01 crc kubenswrapper[4867]: namespace: default Mar 20 00:08:01 crc kubenswrapper[4867]: user: default-auth Mar 20 00:08:01 crc kubenswrapper[4867]: name: default-context Mar 20 00:08:01 crc kubenswrapper[4867]: current-context: default-context Mar 20 00:08:01 crc kubenswrapper[4867]: kind: Config Mar 20 00:08:01 crc kubenswrapper[4867]: preferences: {} Mar 20 00:08:01 crc kubenswrapper[4867]: users: Mar 20 00:08:01 crc kubenswrapper[4867]: - name: default-auth Mar 20 00:08:01 crc kubenswrapper[4867]: user: Mar 20 00:08:01 crc kubenswrapper[4867]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 00:08:01 crc kubenswrapper[4867]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 00:08:01 crc kubenswrapper[4867]: EOF Mar 20 00:08:01 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68jdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:01 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.423973 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:01 crc kubenswrapper[4867]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 00:08:01 crc kubenswrapper[4867]: set -euo pipefail Mar 20 00:08:01 crc kubenswrapper[4867]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 00:08:01 crc kubenswrapper[4867]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 00:08:01 crc kubenswrapper[4867]: # As the secret mount is optional we must wait for the files to be present. Mar 20 00:08:01 crc kubenswrapper[4867]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 20 00:08:01 crc kubenswrapper[4867]: TS=$(date +%s) Mar 20 00:08:01 crc kubenswrapper[4867]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 00:08:01 crc kubenswrapper[4867]: HAS_LOGGED_INFO=0 Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: log_missing_certs(){ Mar 20 00:08:01 crc kubenswrapper[4867]: CUR_TS=$(date +%s) Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 00:08:01 crc kubenswrapper[4867]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 00:08:01 crc kubenswrapper[4867]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 00:08:01 crc kubenswrapper[4867]: HAS_LOGGED_INFO=1 Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: } Mar 20 00:08:01 crc kubenswrapper[4867]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 00:08:01 crc kubenswrapper[4867]: log_missing_certs Mar 20 00:08:01 crc kubenswrapper[4867]: sleep 5 Mar 20 00:08:01 crc kubenswrapper[4867]: done Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 00:08:01 crc kubenswrapper[4867]: exec /usr/bin/kube-rbac-proxy \ Mar 20 00:08:01 crc kubenswrapper[4867]: --logtostderr \ Mar 20 00:08:01 crc kubenswrapper[4867]: --secure-listen-address=:9108 \ Mar 20 00:08:01 crc kubenswrapper[4867]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 00:08:01 crc kubenswrapper[4867]: --upstream=http://127.0.0.1:29108/ \ Mar 20 00:08:01 crc kubenswrapper[4867]: --tls-private-key-file=${TLS_PK} \ Mar 20 00:08:01 crc kubenswrapper[4867]: --tls-cert-file=${TLS_CERT} Mar 20 00:08:01 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xl5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-tfl97_openshift-ovn-kubernetes(a4d1c044-4ed7-44b6-9cd0-e52371e17e40): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:01 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.424862 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:01 crc kubenswrapper[4867]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 00:08:01 crc kubenswrapper[4867]: set -uo pipefail Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 00:08:01 crc kubenswrapper[4867]: HOSTS_FILE="/etc/hosts" Mar 20 00:08:01 crc kubenswrapper[4867]: TEMP_FILE="/etc/hosts.tmp" Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: IFS=', ' read -r -a services 
<<< "${SERVICES}" Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: # Make a temporary file with the old hosts file's attributes. Mar 20 00:08:01 crc kubenswrapper[4867]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 00:08:01 crc kubenswrapper[4867]: echo "Failed to preserve hosts file. Exiting." Mar 20 00:08:01 crc kubenswrapper[4867]: exit 1 Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: while true; do Mar 20 00:08:01 crc kubenswrapper[4867]: declare -A svc_ips Mar 20 00:08:01 crc kubenswrapper[4867]: for svc in "${services[@]}"; do Mar 20 00:08:01 crc kubenswrapper[4867]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 00:08:01 crc kubenswrapper[4867]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 00:08:01 crc kubenswrapper[4867]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 00:08:01 crc kubenswrapper[4867]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 00:08:01 crc kubenswrapper[4867]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:08:01 crc kubenswrapper[4867]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:08:01 crc kubenswrapper[4867]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 00:08:01 crc kubenswrapper[4867]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 00:08:01 crc kubenswrapper[4867]: for i in ${!cmds[*]} Mar 20 00:08:01 crc kubenswrapper[4867]: do Mar 20 00:08:01 crc kubenswrapper[4867]: ips=($(eval "${cmds[i]}")) Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: svc_ips["${svc}"]="${ips[@]}" Mar 20 00:08:01 crc kubenswrapper[4867]: break Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: done Mar 20 00:08:01 crc kubenswrapper[4867]: done Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: # Update /etc/hosts only if we get valid service IPs Mar 20 00:08:01 crc kubenswrapper[4867]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 00:08:01 crc kubenswrapper[4867]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 00:08:01 crc kubenswrapper[4867]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 00:08:01 crc kubenswrapper[4867]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 00:08:01 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:08:01 crc kubenswrapper[4867]: continue Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: # Append resolver entries for services Mar 20 00:08:01 crc kubenswrapper[4867]: rc=0 Mar 20 00:08:01 crc kubenswrapper[4867]: for svc in "${!svc_ips[@]}"; do Mar 20 00:08:01 crc kubenswrapper[4867]: for ip in ${svc_ips[${svc}]}; do Mar 20 00:08:01 crc kubenswrapper[4867]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 00:08:01 crc kubenswrapper[4867]: done Mar 20 00:08:01 crc kubenswrapper[4867]: done Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ $rc -ne 0 ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:08:01 crc kubenswrapper[4867]: continue Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 00:08:01 crc kubenswrapper[4867]: # Replace /etc/hosts with our modified version if needed Mar 20 00:08:01 crc kubenswrapper[4867]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 00:08:01 crc kubenswrapper[4867]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: sleep 60 & wait Mar 20 00:08:01 crc kubenswrapper[4867]: unset svc_ips Mar 20 00:08:01 crc kubenswrapper[4867]: done Mar 20 00:08:01 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ff4ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zgbkt_openshift-dns(02c4bd4e-640d-48b8-8e73-3aead59105b9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:01 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.424972 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" podUID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.425031 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" 
podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.425990 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zgbkt" podUID="02c4bd4e-640d-48b8-8e73-3aead59105b9" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.426236 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:01 crc kubenswrapper[4867]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: set -o allexport Mar 20 00:08:01 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:08:01 crc kubenswrapper[4867]: set +o allexport Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v4_join_subnet_opt= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v6_join_subnet_opt= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v4_transit_switch_subnet_opt= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 00:08:01 crc kubenswrapper[4867]: 
fi Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v6_transit_switch_subnet_opt= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "" != "" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: dns_name_resolver_enabled_flag= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "false" == "true" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: persistent_ips_enabled_flag= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "true" == "true" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: # This is needed so that converting clusters from GA to TP Mar 20 00:08:01 crc kubenswrapper[4867]: # will rollout control plane pods as well Mar 20 00:08:01 crc kubenswrapper[4867]: network_segmentation_enabled_flag= Mar 20 00:08:01 crc kubenswrapper[4867]: multi_network_enabled_flag= Mar 20 00:08:01 crc kubenswrapper[4867]: if [[ "true" == "true" ]]; then Mar 20 00:08:01 crc kubenswrapper[4867]: multi_network_enabled_flag="--enable-multi-network" Mar 20 00:08:01 crc kubenswrapper[4867]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 00:08:01 crc kubenswrapper[4867]: fi Mar 20 00:08:01 crc kubenswrapper[4867]: Mar 20 00:08:01 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 00:08:01 crc kubenswrapper[4867]: exec /usr/bin/ovnkube \ Mar 20 00:08:01 crc kubenswrapper[4867]: --enable-interconnect \ Mar 20 00:08:01 crc 
kubenswrapper[4867]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 00:08:01 crc kubenswrapper[4867]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 00:08:01 crc kubenswrapper[4867]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 00:08:01 crc kubenswrapper[4867]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 00:08:01 crc kubenswrapper[4867]: --metrics-enable-pprof \ Mar 20 00:08:01 crc kubenswrapper[4867]: --metrics-enable-config-duration \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${ovn_v4_join_subnet_opt} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${ovn_v6_join_subnet_opt} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${dns_name_resolver_enabled_flag} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${persistent_ips_enabled_flag} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${multi_network_enabled_flag} \ Mar 20 00:08:01 crc kubenswrapper[4867]: ${network_segmentation_enabled_flag} Mar 20 00:08:01 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xl5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-tfl97_openshift-ovn-kubernetes(a4d1c044-4ed7-44b6-9cd0-e52371e17e40): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:01 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:01 crc kubenswrapper[4867]: E0320 00:08:01.427423 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" podUID="a4d1c044-4ed7-44b6-9cd0-e52371e17e40" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.515241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.515293 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.515301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.515314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.515325 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.617822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.617869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.617880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.617897 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.617908 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.719983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.720027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.720039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.720056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.720068 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.821784 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.821834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.821843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.821858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.821899 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.924610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.924655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.924665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.924681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:01 crc kubenswrapper[4867]: I0320 00:08:01.924693 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:01Z","lastTransitionTime":"2026-03-20T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.027193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.027252 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.027269 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.027294 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.027313 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.129779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.129814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.129822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.129834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.129844 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.232089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.232118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.232126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.232137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.232145 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.334560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.334621 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.334638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.334661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.334679 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.420723 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.420852 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.420875 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:02 crc kubenswrapper[4867]: E0320 00:08:02.421062 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.421108 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:02 crc kubenswrapper[4867]: E0320 00:08:02.421669 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:02 crc kubenswrapper[4867]: E0320 00:08:02.421856 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:02 crc kubenswrapper[4867]: E0320 00:08:02.423696 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:02 crc kubenswrapper[4867]: E0320 00:08:02.425208 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:02 crc kubenswrapper[4867]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 00:08:02 crc kubenswrapper[4867]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 00:08:02 crc kubenswrapper[4867]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvzk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-98n2n_openshift-multus(97e52c03-2ca5-4cad-8459-f03029234544): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:02 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:02 crc kubenswrapper[4867]: E0320 00:08:02.426415 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-98n2n" podUID="97e52c03-2ca5-4cad-8459-f03029234544" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.438991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.439045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.439063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.439089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.439108 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.541850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.541893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.541904 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.541919 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.541931 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.645374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.645438 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.645460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.645487 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.645547 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.748210 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.748256 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.748266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.748283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.748293 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.851532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.851592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.851615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.851642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.851663 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.954411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.954477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.954534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.954562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:02 crc kubenswrapper[4867]: I0320 00:08:02.954579 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:02Z","lastTransitionTime":"2026-03-20T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.057066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.057543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.057771 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.057922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.058056 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.160401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.160790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.161002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.161168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.161303 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.264379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.264434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.264447 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.264467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.264480 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.367355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.367446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.367470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.367560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.367584 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.422103 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:08:03 crc kubenswrapper[4867]: E0320 00:08:03.422303 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.470832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.470868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.470880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.470899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.470912 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.573724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.573766 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.573777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.573793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.573805 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.677203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.677270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.677288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.677313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.677332 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.780307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.780369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.780388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.780416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.780435 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.883602 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.883671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.883696 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.883728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.883750 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.987305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.987359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.987376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.987399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:03 crc kubenswrapper[4867]: I0320 00:08:03.987416 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:03Z","lastTransitionTime":"2026-03-20T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.090467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.090530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.090539 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.090554 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.090564 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.193457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.193566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.193590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.193620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.193642 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.207667 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.207727 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.207753 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.207775 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.207851 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:08:04 crc 
kubenswrapper[4867]: E0320 00:08:04.207889 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:20.207876894 +0000 UTC m=+114.434414411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.207943 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.207976 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.207994 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.208016 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:08:20.207982946 +0000 UTC m=+114.434520493 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.208031 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.208056 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:20.208038108 +0000 UTC m=+114.434575655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.208164 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:20.20813674 +0000 UTC m=+114.434674287 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.298734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.298838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.298855 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.299250 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.299299 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.309340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.309415 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.309613 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.309714 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.309758 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.309783 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.309732 
4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:20.309704099 +0000 UTC m=+114.536241646 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.309923 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:20.309888074 +0000 UTC m=+114.536425621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.401951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.401997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.402015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.402041 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.402059 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.425728 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.426033 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.426624 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.426759 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.428328 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.428638 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.428852 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.428991 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.429447 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:04 crc kubenswrapper[4867]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 00:08:04 crc kubenswrapper[4867]: set -o allexport Mar 20 00:08:04 crc kubenswrapper[4867]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 00:08:04 crc kubenswrapper[4867]: source /etc/kubernetes/apiserver-url.env Mar 20 00:08:04 crc kubenswrapper[4867]: else Mar 20 00:08:04 crc kubenswrapper[4867]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 00:08:04 crc kubenswrapper[4867]: exit 1 Mar 20 00:08:04 crc kubenswrapper[4867]: fi Mar 20 00:08:04 crc kubenswrapper[4867]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 00:08:04 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:04 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.431133 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:04 crc kubenswrapper[4867]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:08:04 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:08:04 crc kubenswrapper[4867]: set -o allexport Mar 20 00:08:04 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:08:04 crc kubenswrapper[4867]: set +o allexport Mar 20 00:08:04 crc 
kubenswrapper[4867]: fi Mar 20 00:08:04 crc kubenswrapper[4867]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 00:08:04 crc kubenswrapper[4867]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 00:08:04 crc kubenswrapper[4867]: ho_enable="--enable-hybrid-overlay" Mar 20 00:08:04 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 00:08:04 crc kubenswrapper[4867]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 00:08:04 crc kubenswrapper[4867]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 00:08:04 crc kubenswrapper[4867]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 00:08:04 crc kubenswrapper[4867]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 00:08:04 crc kubenswrapper[4867]: --webhook-host=127.0.0.1 \ Mar 20 00:08:04 crc kubenswrapper[4867]: --webhook-port=9743 \ Mar 20 00:08:04 crc kubenswrapper[4867]: ${ho_enable} \ Mar 20 00:08:04 crc kubenswrapper[4867]: --enable-interconnect \ Mar 20 00:08:04 crc kubenswrapper[4867]: --disable-approver \ Mar 20 00:08:04 crc kubenswrapper[4867]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 00:08:04 crc kubenswrapper[4867]: --wait-for-kubernetes-api=200s \ Mar 20 00:08:04 crc kubenswrapper[4867]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 00:08:04 crc kubenswrapper[4867]: --loglevel="${LOGLEVEL}" Mar 20 00:08:04 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:04 crc 
kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.431385 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.431420 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.431587 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:04 crc kubenswrapper[4867]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 00:08:04 crc kubenswrapper[4867]: while [ true ]; Mar 20 00:08:04 crc kubenswrapper[4867]: do Mar 20 00:08:04 crc kubenswrapper[4867]: for f in $(ls /tmp/serviceca); do Mar 20 00:08:04 crc 
kubenswrapper[4867]: echo $f Mar 20 00:08:04 crc kubenswrapper[4867]: ca_file_path="/tmp/serviceca/${f}" Mar 20 00:08:04 crc kubenswrapper[4867]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 00:08:04 crc kubenswrapper[4867]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 00:08:04 crc kubenswrapper[4867]: if [ -e "${reg_dir_path}" ]; then Mar 20 00:08:04 crc kubenswrapper[4867]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 00:08:04 crc kubenswrapper[4867]: else Mar 20 00:08:04 crc kubenswrapper[4867]: mkdir $reg_dir_path Mar 20 00:08:04 crc kubenswrapper[4867]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 00:08:04 crc kubenswrapper[4867]: fi Mar 20 00:08:04 crc kubenswrapper[4867]: done Mar 20 00:08:04 crc kubenswrapper[4867]: for d in $(ls /etc/docker/certs.d); do Mar 20 00:08:04 crc kubenswrapper[4867]: echo $d Mar 20 00:08:04 crc kubenswrapper[4867]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 00:08:04 crc kubenswrapper[4867]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 00:08:04 crc kubenswrapper[4867]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 00:08:04 crc kubenswrapper[4867]: rm -rf /etc/docker/certs.d/$d Mar 20 00:08:04 crc kubenswrapper[4867]: fi Mar 20 00:08:04 crc kubenswrapper[4867]: done Mar 20 00:08:04 crc kubenswrapper[4867]: sleep 60 & wait ${!} Mar 20 00:08:04 crc kubenswrapper[4867]: done Mar 20 00:08:04 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-2xwxb_openshift-image-registry(c6d1df48-b16a-4691-82bf-68d8cce94a42): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:04 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.433131 4867 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-2xwxb" podUID="c6d1df48-b16a-4691-82bf-68d8cce94a42" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.433247 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.434615 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:08:04 crc kubenswrapper[4867]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 00:08:04 crc kubenswrapper[4867]: if [[ -f "/env/_master" ]]; then Mar 20 00:08:04 crc kubenswrapper[4867]: set -o allexport Mar 20 00:08:04 crc kubenswrapper[4867]: source "/env/_master" Mar 20 00:08:04 crc kubenswrapper[4867]: set +o allexport Mar 20 00:08:04 crc kubenswrapper[4867]: fi Mar 20 00:08:04 crc kubenswrapper[4867]: Mar 20 00:08:04 crc kubenswrapper[4867]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 00:08:04 crc kubenswrapper[4867]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 00:08:04 crc kubenswrapper[4867]: --disable-webhook \ Mar 20 00:08:04 crc kubenswrapper[4867]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 00:08:04 crc kubenswrapper[4867]: --loglevel="${LOGLEVEL}" Mar 20 00:08:04 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 00:08:04 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:08:04 crc kubenswrapper[4867]: E0320 00:08:04.435862 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to 
\"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.505931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.505993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.506071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.506094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.506112 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.609411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.609471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.609535 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.609565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.609588 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.713284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.713328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.713418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.713452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.713531 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.815221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.815277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.815296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.815318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.815336 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.917710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.917770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.917794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.917820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:04 crc kubenswrapper[4867]: I0320 00:08:04.917842 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:04Z","lastTransitionTime":"2026-03-20T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.021282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.021342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.021367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.021393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.021413 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.124027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.124081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.124092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.124112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.124125 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.227147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.227200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.227219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.227240 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.227254 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.330657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.330705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.330720 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.330751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.330768 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.414315 4867 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.433139 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.433201 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.433217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.433240 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.433257 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.536851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.536918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.536939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.536966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.536995 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.640413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.640754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.640773 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.640799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.640820 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.743965 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.744022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.744039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.744064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.744082 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.821890 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.821981 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.840112 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.852071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.852130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.852148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.852174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.852192 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.871189 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.886603 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.901891 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.920561 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.933111 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.952472 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.954656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.954689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.954699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.955463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.955487 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:05Z","lastTransitionTime":"2026-03-20T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.964643 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.978287 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.989987 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:05 crc kubenswrapper[4867]: I0320 00:08:05.999268 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.007472 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.021551 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3
ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.030978 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.037798 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.046186 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.058564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.058615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 
00:08:06.058632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.058648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.058661 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.162108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.162148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.162161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.162178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.162193 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.265299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.265379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.265400 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.265429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.265452 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.362281 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.362344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.362360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.362383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.362400 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.382852 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.388443 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.388583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.388606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.388630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.388650 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.404791 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.409757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.409805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.409823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.409846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.409864 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.421703 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.421756 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.421910 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.421955 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.421969 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.422067 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.422212 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.422311 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.427291 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.432169 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.432198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.432210 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.432224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.432235 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.436299 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.449895 4867 kubelet_node_status.go:585] "Error updating node status, 
will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.455765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.455815 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.455831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.455853 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.455871 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.473177 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: E0320 00:08:06.473379 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.473809 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec556
8a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.476068 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.476109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.476119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.476194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.476211 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.488651 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.499183 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.509226 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.526107 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.539302 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.554662 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.565691 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.577760 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.578322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.578383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.578394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.578414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.578455 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.591415 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.599479 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.614961 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.626825 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.635760 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.647408 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.681160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.681207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.681219 4867 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.681236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.681250 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.783988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.784310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.784329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.784351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.784392 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.887030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.887084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.887100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.887122 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.887140 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.989792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.989826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.989834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.989846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:06 crc kubenswrapper[4867]: I0320 00:08:06.989855 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:06Z","lastTransitionTime":"2026-03-20T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.092387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.092455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.092477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.092546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.092572 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.194833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.194903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.194926 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.194953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.194974 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.297365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.297436 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.297453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.297476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.297525 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.400405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.400553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.400581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.400611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.400633 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.504549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.504664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.504686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.504709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.504803 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.618011 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.618054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.618063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.618080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.618091 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.722971 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.723041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.723055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.723077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.723093 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.824911 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.824945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.824959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.824979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.824994 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.926960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.926981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.926989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.927001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:07 crc kubenswrapper[4867]: I0320 00:08:07.927009 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:07Z","lastTransitionTime":"2026-03-20T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.029840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.029896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.029919 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.029947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.029969 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.133657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.134071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.134233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.134408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.134592 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.237596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.237663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.237690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.237721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.237745 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.340479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.340575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.340593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.340618 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.340636 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.420561 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.420614 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.420614 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.420576 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:08 crc kubenswrapper[4867]: E0320 00:08:08.420752 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:08 crc kubenswrapper[4867]: E0320 00:08:08.420889 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:08 crc kubenswrapper[4867]: E0320 00:08:08.420983 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:08 crc kubenswrapper[4867]: E0320 00:08:08.421078 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.444736 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.444879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.444900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.444955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.444973 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.547590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.547648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.547661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.547681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.547694 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.651146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.651198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.651217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.651237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.651250 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.755093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.755162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.755181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.755207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.755225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.858368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.858429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.858450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.858477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.858539 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.961724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.961796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.961816 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.961864 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:08 crc kubenswrapper[4867]: I0320 00:08:08.961891 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:08Z","lastTransitionTime":"2026-03-20T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.065365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.065413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.065423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.065439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.065451 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.168885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.168963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.168989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.169053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.169081 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.271405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.271474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.271523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.271548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.271568 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.373978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.374037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.374052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.374074 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.374091 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.475947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.475998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.476015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.476038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.476056 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.579019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.579083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.579101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.579128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.579151 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.682236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.682672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.682690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.682714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.682733 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.786048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.786106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.786117 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.786132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.786142 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.888959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.889000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.889010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.889024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:09 crc kubenswrapper[4867]: I0320 00:08:09.889033 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:09Z","lastTransitionTime":"2026-03-20T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.003071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.003113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.003121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.003139 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.003155 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.110223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.110280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.110295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.110315 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.110330 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.212741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.212783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.212794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.212810 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.212822 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.315721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.315788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.315806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.315830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.315849 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.418743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.418789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.418806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.418829 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.418845 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.421298 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.421370 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.421379 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.421457 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:10 crc kubenswrapper[4867]: E0320 00:08:10.421657 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:10 crc kubenswrapper[4867]: E0320 00:08:10.421745 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:10 crc kubenswrapper[4867]: E0320 00:08:10.421868 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:10 crc kubenswrapper[4867]: E0320 00:08:10.422272 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.522001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.522075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.522098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.522131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.522153 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.625732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.625786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.625802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.625824 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.625843 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.730119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.730175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.730197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.730227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.730249 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.833597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.833867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.834017 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.834170 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.834302 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.937628 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.937668 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.937678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.937696 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:10 crc kubenswrapper[4867]: I0320 00:08:10.937707 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:10Z","lastTransitionTime":"2026-03-20T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.040958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.041008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.041027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.041047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.041066 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.143589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.143653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.143670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.143693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.143711 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.246930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.246988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.247004 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.247026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.247042 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.349897 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.349950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.349967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.349990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.350005 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.453356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.453425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.453446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.453475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.453543 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.556718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.556775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.556793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.556817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.556835 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.659874 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.659939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.659964 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.659996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.660020 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.762641 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.762709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.762727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.762761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.762785 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.865815 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.865888 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.865910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.865944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.865968 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.969324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.969405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.969442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.969472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:11 crc kubenswrapper[4867]: I0320 00:08:11.969541 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:11Z","lastTransitionTime":"2026-03-20T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.072340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.072403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.072420 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.072452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.072471 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.175601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.175652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.175671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.175693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.175710 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.278631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.278709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.278727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.279304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.279363 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.382006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.382085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.382110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.382149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.382172 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.421320 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.421536 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.421626 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:12 crc kubenswrapper[4867]: E0320 00:08:12.421622 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:12 crc kubenswrapper[4867]: E0320 00:08:12.421750 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.421852 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:12 crc kubenswrapper[4867]: E0320 00:08:12.422026 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:12 crc kubenswrapper[4867]: E0320 00:08:12.422137 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.485613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.485657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.485667 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.485686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.485697 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.595264 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.595343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.595367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.595397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.595421 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.698479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.698577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.698591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.698614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.698642 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.801143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.801195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.801212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.801235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.801253 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.843309 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zgbkt" event={"ID":"02c4bd4e-640d-48b8-8e73-3aead59105b9","Type":"ContainerStarted","Data":"21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.859788 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.873688 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.886822 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.898047 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.903984 4867 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.904047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.904068 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.904094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.904113 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:12Z","lastTransitionTime":"2026-03-20T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.926102 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.943062 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.954362 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.971413 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:12 crc kubenswrapper[4867]: I0320 00:08:12.982624 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.008204 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.008261 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.008278 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.008304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.008321 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.015430 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.029910 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.047473 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.066009 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.083682 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.103163 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.113401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.113455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.113474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.113525 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.113544 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.120117 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.218012 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.218455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.218779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.219006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.219193 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.322872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.322934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.322949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.322972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.322990 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.426825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.426867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.426925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.426950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.426968 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.531869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.531928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.531948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.531975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.531994 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.635819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.636144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.636153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.636167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.636176 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.738836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.738900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.738925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.738949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.738966 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.842258 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.842876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.842906 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.842933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.842957 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.848252 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6" exitCode=0 Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.848292 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.865192 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.886457 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.899652 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.926456 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.940615 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.945304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.945363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.945385 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.945416 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.945440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:13Z","lastTransitionTime":"2026-03-20T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.956658 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.973692 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:13 crc kubenswrapper[4867]: I0320 00:08:13.989869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.005739 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.017212 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.031044 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.040414 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.048227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.048395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.048474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.048582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.048646 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.053318 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.062552 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.078621 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3
ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.095680 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.150779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.150825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.150837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.150855 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.150867 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.253767 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.253796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.253804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.253817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.253826 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.356093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.356135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.356146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.356163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.356194 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.421034 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:14 crc kubenswrapper[4867]: E0320 00:08:14.421203 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.421341 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.421474 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.421524 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:14 crc kubenswrapper[4867]: E0320 00:08:14.421536 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:14 crc kubenswrapper[4867]: E0320 00:08:14.422011 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.422380 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:08:14 crc kubenswrapper[4867]: E0320 00:08:14.426220 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:14 crc kubenswrapper[4867]: E0320 00:08:14.426561 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.459448 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.459608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.459631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.459656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.459674 4867 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.562895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.563042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.563064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.563089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.563108 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.666282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.666340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.666358 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.666384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.666402 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.768994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.769053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.769065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.769083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.769095 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.856258 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.856304 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.856318 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.856334 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.856346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.856358 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae"} Mar 20 00:08:14 crc kubenswrapper[4867]: 
I0320 00:08:14.870980 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.871038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.871054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.871079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.871097 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.974082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.974138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.974154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.974177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:14 crc kubenswrapper[4867]: I0320 00:08:14.974194 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:14Z","lastTransitionTime":"2026-03-20T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.077203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.077262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.077299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.077332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.077353 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.179953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.180030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.180052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.180081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.180103 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.282316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.282364 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.282376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.282394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.282429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.384648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.384676 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.384686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.384701 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.384712 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.486606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.486641 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.486652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.486666 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.486678 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.588820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.588846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.588855 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.588867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.588875 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.692995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.693028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.693039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.693055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.693065 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.795836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.795902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.795920 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.795975 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.795994 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.861557 4867 generic.go:334] "Generic (PLEG): container finished" podID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" containerID="d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b" exitCode=0 Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.861617 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerDied","Data":"d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.878464 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.889075 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.898271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.898355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.898378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.898409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.898435 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:15Z","lastTransitionTime":"2026-03-20T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.902812 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.918029 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.933898 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.947128 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.960999 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.975272 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:15 crc kubenswrapper[4867]: I0320 00:08:15.989259 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.002073 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.003054 4867 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.003094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.003109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.003135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.003154 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.018574 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.032954 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.048220 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.068314 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.080293 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.107729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.107907 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.107991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.108057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.108111 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.111070 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.211954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.212020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.212034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.212056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.212072 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.316412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.316549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.316569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.316594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.316613 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.419045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.419207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.419235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.419259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.419277 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.421069 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.421298 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.421559 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.421712 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.421744 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.421758 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.422028 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.422105 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.442866 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.465249 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.484253 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.495433 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.514125 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.528347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.528668 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.528678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.528691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.528675 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.528701 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.544535 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.564059 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.573196 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.594815 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3
ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.609968 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.621276 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.631935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.631962 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.631971 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.631985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.631994 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.632660 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.640389 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.648013 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.657478 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.698673 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.698697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.698706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.698719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.698729 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.708726 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.712895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.712946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.712959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.712979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.712990 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.725107 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.729287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.729312 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.729320 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.729333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.729342 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.736976 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.743970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.743998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.744008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.744025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.744036 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.757243 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.760832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.760880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.760902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.760922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.760935 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.771847 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: E0320 00:08:16.772012 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.773706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.773735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.773744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.773756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.773764 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.865798 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" event={"ID":"a4d1c044-4ed7-44b6-9cd0-e52371e17e40","Type":"ContainerStarted","Data":"b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.865847 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" event={"ID":"a4d1c044-4ed7-44b6-9cd0-e52371e17e40","Type":"ContainerStarted","Data":"96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.867858 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.869354 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.869405 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.872705 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" 
event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.874482 4867 generic.go:334] "Generic (PLEG): container finished" podID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" containerID="a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95" exitCode=0 Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.874524 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerDied","Data":"a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.875626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.875665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.875680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.875700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.875713 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.876415 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98n2n" event={"ID":"97e52c03-2ca5-4cad-8459-f03029234544","Type":"ContainerStarted","Data":"b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.877670 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.888401 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.900169 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.913648 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.920780 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.939334 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.951670 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.961530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.972656 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.980622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.980796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.980990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.981181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.981359 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:16Z","lastTransitionTime":"2026-03-20T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:16 crc kubenswrapper[4867]: I0320 00:08:16.991965 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.003816 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers 
with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.012104 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.020578 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.028432 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.036888 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.043827 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.050648 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.059300 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.067396 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.084726 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3
ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.086521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.086555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.086567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.086583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.086595 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.095925 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.102694 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.110947 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.120358 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.139998 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.151965 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.161745 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.173664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc 
kubenswrapper[4867]: I0320 00:08:17.182987 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.188915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.188963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.188976 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.188995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.189008 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.196869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.215125 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.226613 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.291969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.292046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.292066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 
00:08:17.292091 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.292109 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.395116 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.395156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.395164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.395178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.395188 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.498213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.498550 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.498560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.498574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.498583 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.604780 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.604819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.604831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.604850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.604863 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.707592 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.707697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.707716 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.707770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.707788 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.810782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.810850 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.810869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.810896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.810918 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.881087 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2xwxb" event={"ID":"c6d1df48-b16a-4691-82bf-68d8cce94a42","Type":"ContainerStarted","Data":"2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.883553 4867 generic.go:334] "Generic (PLEG): container finished" podID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" containerID="17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2" exitCode=0 Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.883594 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerDied","Data":"17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.913442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.913475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.913483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.913509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.913518 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:17Z","lastTransitionTime":"2026-03-20T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.919225 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.938754 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.952384 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.974114 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.987029 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:17 crc kubenswrapper[4867]: I0320 00:08:17.998288 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.011633 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc 
kubenswrapper[4867]: I0320 00:08:18.015846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.015891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.015933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.015956 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.015971 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.023869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z 
is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.035251 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.050390 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.066982 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.086006 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.153179 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.154553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.154585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.154594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.154607 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.154616 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.171029 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.181343 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.192825 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.211203 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.223955 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.234931 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.244020 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.252972 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.256619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.256665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.256673 4867 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.256688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.256699 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.264368 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2
edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\
\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.274575 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc 
kubenswrapper[4867]: I0320 00:08:18.286766 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc 
kubenswrapper[4867]: I0320 00:08:18.300095 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.314170 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.330038 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 
00:08:18.339871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.360068 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.360896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.360921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.360930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.360942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.360963 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.375756 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.389256 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.403869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.421305 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.421390 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:18 crc kubenswrapper[4867]: E0320 00:08:18.421410 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.421437 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:18 crc kubenswrapper[4867]: E0320 00:08:18.421545 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.421571 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:18 crc kubenswrapper[4867]: E0320 00:08:18.421655 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:18 crc kubenswrapper[4867]: E0320 00:08:18.421688 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.465091 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.465132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.465143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.465160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.465173 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.567612 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.567645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.567653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.567665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.567673 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.669992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.670037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.670047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.670060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.670071 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.771994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.772034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.772045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.772060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.772073 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.874613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.874663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.874680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.874705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.874724 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.890346 4867 generic.go:334] "Generic (PLEG): container finished" podID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" containerID="941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe" exitCode=0 Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.890402 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerDied","Data":"941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.919012 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afd
f020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.936220 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed
0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.961075 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.975591 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.977022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.977056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.977083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:18 crc 
kubenswrapper[4867]: I0320 00:08:18.977100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.977112 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:18Z","lastTransitionTime":"2026-03-20T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:18 crc kubenswrapper[4867]: I0320 00:08:18.993713 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.009020 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.022903 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.037428 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.046797 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.058313 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.068920 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.079388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.079417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.079427 4867 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.079443 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.079452 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.079850 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2
edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\
\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.094433 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc 
kubenswrapper[4867]: I0320 00:08:19.112657 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.125297 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.140873 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.182264 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc 
kubenswrapper[4867]: I0320 00:08:19.182309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.182321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.182339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.182351 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.284830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.284904 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.284924 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.284944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.284957 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.387545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.387586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.387597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.387615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.387626 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.490461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.490558 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.490579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.490605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.490625 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.593560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.593596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.593605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.593618 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.593628 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.700556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.700610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.700638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.700665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.700686 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.803447 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.803515 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.803528 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.803545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.803558 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.898768 4867 generic.go:334] "Generic (PLEG): container finished" podID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" containerID="e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac" exitCode=0 Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.898831 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerDied","Data":"e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.909247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.909301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.909324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.909354 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.909378 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:19Z","lastTransitionTime":"2026-03-20T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.910990 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0"} Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.911720 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.913972 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.921851 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.944217 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.949547 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.970549 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:19 crc kubenswrapper[4867]: I0320 00:08:19.983531 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21
d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.010844 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.014210 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.014264 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.014283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.014307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.014324 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.032908 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.048771 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.062984 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.076562 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc 
kubenswrapper[4867]: I0320 00:08:20.099379 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.113351 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.117685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.117746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.117764 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.117787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.117804 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.124445 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.139532 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.151675 4867 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.163912 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.177172 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.190696 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.202743 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.218971 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.220938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.221034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.221058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.221085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.221105 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.230340 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.252092 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.267643 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.279658 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.295954 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296189 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:08:52.2961543 +0000 UTC m=+146.522691827 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.296306 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.296389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.296447 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296610 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296651 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296675 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296721 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296757 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:52.296727314 +0000 UTC m=+146.523264871 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296619 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.296337 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.296793 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:52.296772325 +0000 UTC m=+146.523309912 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.297233 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:52.297208346 +0000 UTC m=+146.523745923 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.317875 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.324898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.324960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.324979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.325003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.325021 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.331745 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.353984 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.368294 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.383238 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.397156 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.397224 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.397382 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.397473 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.397523 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:52.397457643 +0000 UTC m=+146.623995200 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.397537 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.397561 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.397634 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:08:52.397609587 +0000 UTC m=+146.624147144 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.400406 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.416187 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.420632 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.420716 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.420817 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.420813 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.420867 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.421030 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.421191 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:20 crc kubenswrapper[4867]: E0320 00:08:20.421579 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.432599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.432740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.432881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.432923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.432949 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.434818 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z 
is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.535915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.535979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.535991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.536011 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.536023 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.639157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.639215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.639232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.639254 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.639270 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.741520 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.741556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.741565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.741578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.741588 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.844312 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.844357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.844368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.844384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.844395 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.923469 4867 generic.go:334] "Generic (PLEG): container finished" podID="c16597ca-4e52-469a-b6c2-cb2c0d0f07cf" containerID="2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f" exitCode=0 Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.923581 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerDied","Data":"2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.925163 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.925942 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.947238 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.947938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.947983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.947997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.948017 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.948032 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:20Z","lastTransitionTime":"2026-03-20T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.951286 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.967642 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:20 crc kubenswrapper[4867]: I0320 00:08:20.981809 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.001577 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:20Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.011831 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.027164 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.039577 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.051396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.051457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.051469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.051484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.051519 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.053269 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.066927 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.080839 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.092862 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc 
kubenswrapper[4867]: I0320 00:08:21.115243 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.128608 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.139243 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.158188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.158232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.158248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.158266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.158278 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.167805 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.192435 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.210345 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.224519 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.249021 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.260329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.260362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.260370 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.260383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.260395 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.264607 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afd
f020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.273893 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.294850 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.307405 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.320091 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.332308 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.341530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.353111 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc 
kubenswrapper[4867]: I0320 00:08:21.362911 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.362946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.362956 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.362973 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.362984 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.384212 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.428666 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.439050 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.449319 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.465232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.465298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.465319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.465344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.465365 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.466160 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.568200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc 
kubenswrapper[4867]: I0320 00:08:21.568567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.568588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.568613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.568631 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.671107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.671174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.671193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.671219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.671240 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.775466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.775569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.775589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.775615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.775628 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.878734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.878777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.878789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.878806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.878817 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.934331 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/0.log" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.938747 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0" exitCode=1 Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.938859 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.941813 4867 scope.go:117] "RemoveContainer" containerID="f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.949389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" event={"ID":"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf","Type":"ContainerStarted","Data":"48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df"} Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.959119 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.975742 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:21Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:21 crc 
kubenswrapper[4867]: I0320 00:08:21.981118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.981174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.981192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.981216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:21 crc kubenswrapper[4867]: I0320 00:08:21.981232 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:21Z","lastTransitionTime":"2026-03-20T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.007535 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.027426 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.040073 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.057148 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.075430 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.083586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.083615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.083623 4867 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.083637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.083645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.093736 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.109718 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.123097 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.135876 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.145061 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.163302 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"message\\\":\\\"y.go:141\\\\nI0320 00:08:21.371545 6723 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371694 6723 reflector.go:311] Stopping reflector *v1.Node 
(0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371937 6723 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 00:08:21.371951 6723 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 00:08:21.371968 6723 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 00:08:21.371974 6723 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 00:08:21.371997 6723 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 00:08:21.372007 6723 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 00:08:21.372013 6723 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 00:08:21.372024 6723 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 00:08:21.372039 6723 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 00:08:21.372050 6723 factory.go:656] Stopping watch factory\\\\nI0320 00:08:21.372052 6723 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 00:08:21.372053 6723 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 00:08:21.372065 6723 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e
97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.175040 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fa
eb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.185804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.185842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.185853 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.185869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.185881 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.186300 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.201215 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.215440 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.229827 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.243780 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.264990 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.283603 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.287784 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.287832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.287845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.287859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.287868 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.299180 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.312994 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.325189 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.335461 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.347157 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc 
kubenswrapper[4867]: I0320 00:08:22.363451 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc 
kubenswrapper[4867]: I0320 00:08:22.381584 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.390251 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.390295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.390307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.390323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.390334 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.400042 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.419447 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.421532 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:22 crc kubenswrapper[4867]: E0320 00:08:22.421717 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.421804 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.421896 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:22 crc kubenswrapper[4867]: E0320 00:08:22.422026 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.422066 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:22 crc kubenswrapper[4867]: E0320 00:08:22.422149 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:22 crc kubenswrapper[4867]: E0320 00:08:22.422253 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.435888 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.469748 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"message\\\":\\\"y.go:141\\\\nI0320 00:08:21.371545 6723 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371694 6723 reflector.go:311] Stopping reflector *v1.Node 
(0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371937 6723 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 00:08:21.371951 6723 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 00:08:21.371968 6723 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 00:08:21.371974 6723 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 00:08:21.371997 6723 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 00:08:21.372007 6723 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 00:08:21.372013 6723 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 00:08:21.372024 6723 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 00:08:21.372039 6723 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 00:08:21.372050 6723 factory.go:656] Stopping watch factory\\\\nI0320 00:08:21.372052 6723 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 00:08:21.372053 6723 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 00:08:21.372065 6723 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e
97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.492665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.492715 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.492729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.492754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.492771 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.595881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.595923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.595934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.595949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.595961 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.698432 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.698534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.698552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.698577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.698596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.801199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.801231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.801239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.801253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.801262 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.905253 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.905285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.905295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.905310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.905441 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:22Z","lastTransitionTime":"2026-03-20T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.954649 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/0.log" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.956982 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551"} Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.957374 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.975757 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:22 crc kubenswrapper[4867]: I0320 00:08:22.989473 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.001791 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.008140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.008177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.008190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc 
kubenswrapper[4867]: I0320 00:08:23.008207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.008220 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.013656 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc 
kubenswrapper[4867]: I0320 00:08:23.032057 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.046965 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.059265 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.077967 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.091671 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.111476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.111532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.111545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.111564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.111579 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.116986 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"message\\\":\\\"y.go:141\\\\nI0320 00:08:21.371545 6723 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371694 6723 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371937 6723 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0320 00:08:21.371951 6723 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 00:08:21.371968 6723 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 00:08:21.371974 6723 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 00:08:21.371997 6723 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 00:08:21.372007 6723 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 00:08:21.372013 6723 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 00:08:21.372024 6723 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 00:08:21.372039 6723 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 00:08:21.372050 6723 factory.go:656] Stopping watch factory\\\\nI0320 00:08:21.372052 6723 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 00:08:21.372053 6723 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 00:08:21.372065 6723 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\"
,\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.134204 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.152089 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.174819 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.197623 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.214109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.214177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.214193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.214213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.214226 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.215980 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.232752 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.316625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.316685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.316702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.316726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.316746 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.419468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.419560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.419579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.419603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.419620 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.522248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.522295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.522307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.522324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.522336 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.625136 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.625197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.625213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.625238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.625256 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.728357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.728405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.728418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.728435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.728446 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.832033 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.832073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.832081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.832094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.832103 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.934153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.934222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.934246 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.934277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.934300 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:23Z","lastTransitionTime":"2026-03-20T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.962397 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/1.log" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.963194 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/0.log" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.966623 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551" exitCode=1 Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.966663 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551"} Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.966698 4867 scope.go:117] "RemoveContainer" containerID="f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.967906 4867 scope.go:117] "RemoveContainer" containerID="a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551" Mar 20 00:08:23 crc kubenswrapper[4867]: E0320 00:08:23.968218 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:08:23 crc kubenswrapper[4867]: I0320 00:08:23.986850 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.002371 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc 
kubenswrapper[4867]: I0320 00:08:24.018977 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.035060 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc 
kubenswrapper[4867]: I0320 00:08:24.036786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.036854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.036875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.036902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.036923 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.065340 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.081624 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.098151 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.117192 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.131276 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.140225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.140259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.140271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.140288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.140301 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.162326 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7095e68944d86c6a0389988f9efab836fc063936148500e7d57c08aefbff2d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"message\\\":\\\"y.go:141\\\\nI0320 00:08:21.371545 6723 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371694 6723 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 00:08:21.371937 6723 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0320 00:08:21.371951 6723 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 00:08:21.371968 6723 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 00:08:21.371974 6723 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 00:08:21.371997 6723 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 00:08:21.372007 6723 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 00:08:21.372013 6723 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 00:08:21.372024 6723 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 00:08:21.372039 6723 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 00:08:21.372050 6723 factory.go:656] Stopping watch factory\\\\nI0320 00:08:21.372052 6723 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 00:08:21.372053 6723 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 00:08:21.372065 6723 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.182545 
4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.196099 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.219062 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.243301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.243332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.243341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.243356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.243366 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.245363 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.270503 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.291991 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.345782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.345823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.345834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 
00:08:24.345848 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.345858 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.420726 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.420764 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.420764 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:24 crc kubenswrapper[4867]: E0320 00:08:24.420943 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.420992 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:24 crc kubenswrapper[4867]: E0320 00:08:24.421187 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:24 crc kubenswrapper[4867]: E0320 00:08:24.421349 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:24 crc kubenswrapper[4867]: E0320 00:08:24.421483 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.448999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.449046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.449064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.449087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.449110 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.552339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.552403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.552423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.552449 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.552466 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.655847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.655905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.655921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.655945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.655961 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.758659 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.758713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.758729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.758753 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.758770 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.862796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.862933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.863007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.863039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.863057 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.966294 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.967200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.967391 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.967595 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.967762 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:24Z","lastTransitionTime":"2026-03-20T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.973824 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/1.log" Mar 20 00:08:24 crc kubenswrapper[4867]: I0320 00:08:24.979565 4867 scope.go:117] "RemoveContainer" containerID="a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551" Mar 20 00:08:24 crc kubenswrapper[4867]: E0320 00:08:24.979828 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.001228 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.021283 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.041181 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.065231 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.072118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.072373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.072563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.072730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.072857 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.083679 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.115658 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.137622 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.157371 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.176911 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.176981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.176998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.177024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.177041 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.182089 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.215026 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.236348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.253014 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.273077 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.282190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.282252 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.282272 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.282300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.282321 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.298460 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.318474 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5
f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.335356 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:25 crc 
kubenswrapper[4867]: I0320 00:08:25.385157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.385221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.385238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.385263 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.385284 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.488580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.488667 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.488691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.488721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.488744 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.591693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.591748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.591766 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.591805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.591821 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.694996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.695055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.695072 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.695098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.695115 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.798298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.798352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.798366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.798383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.798395 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.902545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.902603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.902620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.902647 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:25 crc kubenswrapper[4867]: I0320 00:08:25.902664 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:25Z","lastTransitionTime":"2026-03-20T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.005744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.005791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.005803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.005820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.005835 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:26Z","lastTransitionTime":"2026-03-20T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.109542 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.109903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.110121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.110351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.110599 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:26Z","lastTransitionTime":"2026-03-20T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.213864 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.213908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.213923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.213944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.213958 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:26Z","lastTransitionTime":"2026-03-20T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.316656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.317010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.317161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.317310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.317464 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:26Z","lastTransitionTime":"2026-03-20T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 00:08:26 crc kubenswrapper[4867]: E0320 00:08:26.418284 4867 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.420718 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.420797 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.420821 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:08:26 crc kubenswrapper[4867]: E0320 00:08:26.421083 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.421161 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:08:26 crc kubenswrapper[4867]: E0320 00:08:26.421283 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 00:08:26 crc kubenswrapper[4867]: E0320 00:08:26.421422 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:26 crc kubenswrapper[4867]: E0320 00:08:26.421565 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.435591 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.453765 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.474166 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: E0320 00:08:26.482749 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.484911 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.507030 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.523540 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fa
eb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.541388 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.559597 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.578020 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.592283 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc 
kubenswrapper[4867]: I0320 00:08:26.630230 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.654098 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.671247 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.690552 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.709513 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:26 crc kubenswrapper[4867]: I0320 00:08:26.732774 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.029587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:27 crc 
kubenswrapper[4867]: I0320 00:08:27.030023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.030041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.030067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.030085 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:27Z","lastTransitionTime":"2026-03-20T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:27 crc kubenswrapper[4867]: E0320 00:08:27.052930 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.058881 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.058988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.059010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.059035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.059053 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:27Z","lastTransitionTime":"2026-03-20T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:27 crc kubenswrapper[4867]: E0320 00:08:27.080353 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.085638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.085691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.085708 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.085733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.085751 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:27Z","lastTransitionTime":"2026-03-20T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:27 crc kubenswrapper[4867]: E0320 00:08:27.113815 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.119573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.119634 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.119653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.119677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.119695 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:27Z","lastTransitionTime":"2026-03-20T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:27 crc kubenswrapper[4867]: E0320 00:08:27.141897 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.146828 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.146869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.146882 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.146922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.146934 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:27Z","lastTransitionTime":"2026-03-20T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:27 crc kubenswrapper[4867]: E0320 00:08:27.166366 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:27 crc kubenswrapper[4867]: E0320 00:08:27.166536 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.421562 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.993558 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 00:08:27 crc kubenswrapper[4867]: I0320 00:08:27.996319 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28"} Mar 20 00:08:28 crc kubenswrapper[4867]: I0320 00:08:28.420951 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:28 crc kubenswrapper[4867]: I0320 00:08:28.421060 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:28 crc kubenswrapper[4867]: I0320 00:08:28.421104 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:28 crc kubenswrapper[4867]: I0320 00:08:28.421234 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:28 crc kubenswrapper[4867]: E0320 00:08:28.421223 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:28 crc kubenswrapper[4867]: E0320 00:08:28.421459 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:28 crc kubenswrapper[4867]: E0320 00:08:28.421625 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:28 crc kubenswrapper[4867]: E0320 00:08:28.421744 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:28 crc kubenswrapper[4867]: I0320 00:08:28.998944 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.012802 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.023223 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-pr
oxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.037626 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc 
kubenswrapper[4867]: I0320 00:08:29.059570 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.079014 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.096748 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.110542 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.123971 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.153123 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.169856 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.182594 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.198097 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.206903 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.218204 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.229211 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:29 crc kubenswrapper[4867]: I0320 00:08:29.240236 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:30 crc kubenswrapper[4867]: I0320 00:08:30.421447 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:30 crc kubenswrapper[4867]: I0320 00:08:30.421480 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:30 crc kubenswrapper[4867]: E0320 00:08:30.421595 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:30 crc kubenswrapper[4867]: I0320 00:08:30.421616 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:30 crc kubenswrapper[4867]: E0320 00:08:30.421664 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:30 crc kubenswrapper[4867]: E0320 00:08:30.421716 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:30 crc kubenswrapper[4867]: I0320 00:08:30.421723 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:30 crc kubenswrapper[4867]: E0320 00:08:30.421868 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:31 crc kubenswrapper[4867]: E0320 00:08:31.484209 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:08:32 crc kubenswrapper[4867]: I0320 00:08:32.420834 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:32 crc kubenswrapper[4867]: I0320 00:08:32.420953 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:32 crc kubenswrapper[4867]: I0320 00:08:32.420957 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:32 crc kubenswrapper[4867]: E0320 00:08:32.421036 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:32 crc kubenswrapper[4867]: I0320 00:08:32.421097 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:32 crc kubenswrapper[4867]: E0320 00:08:32.421093 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:32 crc kubenswrapper[4867]: E0320 00:08:32.421258 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:32 crc kubenswrapper[4867]: E0320 00:08:32.421423 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:34 crc kubenswrapper[4867]: I0320 00:08:34.421223 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:34 crc kubenswrapper[4867]: E0320 00:08:34.421419 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:34 crc kubenswrapper[4867]: I0320 00:08:34.421746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:34 crc kubenswrapper[4867]: I0320 00:08:34.421801 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:34 crc kubenswrapper[4867]: E0320 00:08:34.421863 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:34 crc kubenswrapper[4867]: E0320 00:08:34.421932 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:34 crc kubenswrapper[4867]: I0320 00:08:34.421975 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:34 crc kubenswrapper[4867]: E0320 00:08:34.422039 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:35 crc kubenswrapper[4867]: I0320 00:08:35.542950 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:35 crc kubenswrapper[4867]: E0320 00:08:35.543079 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:35 crc kubenswrapper[4867]: I0320 00:08:35.543190 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:35 crc kubenswrapper[4867]: E0320 00:08:35.543402 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.420995 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:36 crc kubenswrapper[4867]: E0320 00:08:36.421369 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.421606 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:36 crc kubenswrapper[4867]: E0320 00:08:36.421685 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.435348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.446796 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.457740 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.469608 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.477918 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: E0320 00:08:36.485053 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.494706 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.507714 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.526174 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.541974 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.554464 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.564847 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc 
kubenswrapper[4867]: I0320 00:08:36.592152 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.608964 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.620974 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.635545 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:36 crc kubenswrapper[4867]: I0320 00:08:36.650571 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.399438 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.399468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.399476 4867 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.399506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.399516 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:37Z","lastTransitionTime":"2026-03-20T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.416427 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.419424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.419461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.419469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.419484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.419505 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:37Z","lastTransitionTime":"2026-03-20T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.420625 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.420637 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.420749 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.420835 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.429836 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.432550 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.432578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.432590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.432604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.432614 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:37Z","lastTransitionTime":"2026-03-20T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.443158 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.445979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.446005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.446014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.446024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.446049 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:37Z","lastTransitionTime":"2026-03-20T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.455762 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.458710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.458749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.458760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.458774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:37 crc kubenswrapper[4867]: I0320 00:08:37.458783 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:37Z","lastTransitionTime":"2026-03-20T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.468266 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:37 crc kubenswrapper[4867]: E0320 00:08:37.468409 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:08:38 crc kubenswrapper[4867]: I0320 00:08:38.420982 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:38 crc kubenswrapper[4867]: E0320 00:08:38.421175 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:38 crc kubenswrapper[4867]: I0320 00:08:38.439547 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:38 crc kubenswrapper[4867]: E0320 00:08:38.439724 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.421289 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:39 crc kubenswrapper[4867]: E0320 00:08:39.421434 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.421601 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:39 crc kubenswrapper[4867]: E0320 00:08:39.421846 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.422958 4867 scope.go:117] "RemoveContainer" containerID="a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.456640 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.480061 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168
.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.515425 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.533586 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.547931 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.565683 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.583212 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.597532 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.616166 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.630664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.643212 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.656477 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.667253 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc 
kubenswrapper[4867]: I0320 00:08:39.690052 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.702810 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.711871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:39 crc kubenswrapper[4867]: I0320 00:08:39.721687 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:39Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.421248 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:40 crc kubenswrapper[4867]: E0320 00:08:40.421366 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.421249 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:40 crc kubenswrapper[4867]: E0320 00:08:40.421633 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.559722 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/2.log" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.560299 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/1.log" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.563039 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3" exitCode=1 Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.563099 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3"} Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.563143 4867 scope.go:117] "RemoveContainer" containerID="a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.564035 4867 scope.go:117] "RemoveContainer" containerID="0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3" Mar 20 00:08:40 crc kubenswrapper[4867]: 
E0320 00:08:40.564365 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.577837 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.588567 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.599217 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc 
kubenswrapper[4867]: I0320 00:08:40.621331 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.638203 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.649729 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.662222 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.674053 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.691174 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for 
network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e
6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.702661 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.715025 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.731334 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.747390 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.767023 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.780060 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:40 crc kubenswrapper[4867]: I0320 00:08:40.795397 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:41 crc kubenswrapper[4867]: I0320 00:08:41.421635 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:41 crc kubenswrapper[4867]: I0320 00:08:41.421677 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:41 crc kubenswrapper[4867]: E0320 00:08:41.422179 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:41 crc kubenswrapper[4867]: E0320 00:08:41.422282 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:41 crc kubenswrapper[4867]: I0320 00:08:41.435683 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 00:08:41 crc kubenswrapper[4867]: E0320 00:08:41.486669 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:08:41 crc kubenswrapper[4867]: I0320 00:08:41.570638 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/2.log" Mar 20 00:08:42 crc kubenswrapper[4867]: I0320 00:08:42.421484 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:42 crc kubenswrapper[4867]: E0320 00:08:42.421611 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:42 crc kubenswrapper[4867]: I0320 00:08:42.421673 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:42 crc kubenswrapper[4867]: E0320 00:08:42.421859 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:43 crc kubenswrapper[4867]: I0320 00:08:43.420937 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:43 crc kubenswrapper[4867]: I0320 00:08:43.420992 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:43 crc kubenswrapper[4867]: E0320 00:08:43.421137 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:43 crc kubenswrapper[4867]: E0320 00:08:43.421233 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:44 crc kubenswrapper[4867]: I0320 00:08:44.420718 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:44 crc kubenswrapper[4867]: I0320 00:08:44.420798 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:44 crc kubenswrapper[4867]: E0320 00:08:44.420904 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:44 crc kubenswrapper[4867]: E0320 00:08:44.421029 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:45 crc kubenswrapper[4867]: I0320 00:08:45.420679 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:45 crc kubenswrapper[4867]: I0320 00:08:45.420690 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:45 crc kubenswrapper[4867]: E0320 00:08:45.420932 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:45 crc kubenswrapper[4867]: E0320 00:08:45.421140 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.421132 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.421180 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:46 crc kubenswrapper[4867]: E0320 00:08:46.421397 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:46 crc kubenswrapper[4867]: E0320 00:08:46.421612 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.447319 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.465117 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.477104 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: E0320 00:08:46.487672 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.489100 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d3
6f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.501864 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.531128 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a536045778813c7b448693d8aaf0dd3600318536a8fd613cbd2788840d72a551\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:23Z\\\",\\\"message\\\":\\\"luster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.183\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 00:08:22.980323 6906 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-operator per-node LB for network=default: []services.LB{}\\\\nI0320 00:08:22.980332 6906 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-operator template LB for network=default: []services.LB{}\\\\nF0320 00:08:22.980337 6906 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for 
network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e
6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.546443 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.567832 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.592003 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.615368 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.635474 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.650357 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.663264 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.681881 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.701386 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc kubenswrapper[4867]: I0320 00:08:46.719125 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:46 crc 
kubenswrapper[4867]: I0320 00:08:46.756321 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.421526 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.421567 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:47 crc kubenswrapper[4867]: E0320 00:08:47.422868 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:47 crc kubenswrapper[4867]: E0320 00:08:47.422735 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.862296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.862399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.862540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.862576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.862636 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:47Z","lastTransitionTime":"2026-03-20T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:47 crc kubenswrapper[4867]: E0320 00:08:47.885998 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.897341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.897620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.897784 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.897932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.898065 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:47Z","lastTransitionTime":"2026-03-20T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:47 crc kubenswrapper[4867]: E0320 00:08:47.920281 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.926040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.926098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.926116 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.926142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.926164 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:47Z","lastTransitionTime":"2026-03-20T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:47 crc kubenswrapper[4867]: E0320 00:08:47.966747 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.974938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.975179 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.975285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.975425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:47 crc kubenswrapper[4867]: I0320 00:08:47.975544 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:47Z","lastTransitionTime":"2026-03-20T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:47 crc kubenswrapper[4867]: E0320 00:08:47.997213 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.001442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.001570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.001633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.001701 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.001763 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:48Z","lastTransitionTime":"2026-03-20T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:48 crc kubenswrapper[4867]: E0320 00:08:48.013370 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:48 crc kubenswrapper[4867]: E0320 00:08:48.013630 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.421405 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.421560 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:48 crc kubenswrapper[4867]: E0320 00:08:48.421643 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:48 crc kubenswrapper[4867]: E0320 00:08:48.421791 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.866016 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.867624 4867 scope.go:117] "RemoveContainer" containerID="0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3" Mar 20 00:08:48 crc kubenswrapper[4867]: E0320 00:08:48.867952 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.888845 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.908580 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.928400 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.946678 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:48 crc kubenswrapper[4867]: I0320 00:08:48.980586 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] 
Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.000026 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.020560 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.033836 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.045897 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.060010 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc 
kubenswrapper[4867]: I0320 00:08:49.094025 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.109846 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.123361 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.138544 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.153163 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.178324 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.191256 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.421069 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:49 crc kubenswrapper[4867]: I0320 00:08:49.421096 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:49 crc kubenswrapper[4867]: E0320 00:08:49.421320 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:49 crc kubenswrapper[4867]: E0320 00:08:49.421566 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:50 crc kubenswrapper[4867]: I0320 00:08:50.421327 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:50 crc kubenswrapper[4867]: E0320 00:08:50.421471 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:50 crc kubenswrapper[4867]: I0320 00:08:50.421905 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:50 crc kubenswrapper[4867]: E0320 00:08:50.422007 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:51 crc kubenswrapper[4867]: I0320 00:08:51.421576 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:51 crc kubenswrapper[4867]: I0320 00:08:51.421589 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:51 crc kubenswrapper[4867]: E0320 00:08:51.421764 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:51 crc kubenswrapper[4867]: E0320 00:08:51.421938 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:51 crc kubenswrapper[4867]: E0320 00:08:51.489159 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.346035 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.346229 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:09:56.346188309 +0000 UTC m=+210.572725866 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.346622 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.346717 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.346823 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.346823 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.346905 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.346914 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:09:56.346898577 +0000 UTC m=+210.573436124 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.346987 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 00:09:56.346969459 +0000 UTC m=+210.573507006 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.347074 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.347095 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.347114 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.347157 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 00:09:56.347144034 +0000 UTC m=+210.573681591 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.421387 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.421557 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.421609 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.422062 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.438074 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.447448 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:52 crc kubenswrapper[4867]: I0320 00:08:52.447604 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.447687 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.447810 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.447868 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.447889 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.447914 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs podName:0e040dc6-20c6-4d82-b719-bf25fa43db67 nodeName:}" failed. No retries permitted until 2026-03-20 00:09:56.447877576 +0000 UTC m=+210.674415173 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs") pod "network-metrics-daemon-rkq8h" (UID: "0e040dc6-20c6-4d82-b719-bf25fa43db67") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 00:08:52 crc kubenswrapper[4867]: E0320 00:08:52.447951 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 00:09:56.447932128 +0000 UTC m=+210.674469685 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 00:08:53 crc kubenswrapper[4867]: I0320 00:08:53.420644 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:53 crc kubenswrapper[4867]: I0320 00:08:53.420697 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:53 crc kubenswrapper[4867]: E0320 00:08:53.420770 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:53 crc kubenswrapper[4867]: E0320 00:08:53.420989 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:54 crc kubenswrapper[4867]: I0320 00:08:54.420994 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:54 crc kubenswrapper[4867]: I0320 00:08:54.421157 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:54 crc kubenswrapper[4867]: E0320 00:08:54.421366 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:54 crc kubenswrapper[4867]: E0320 00:08:54.421549 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:55 crc kubenswrapper[4867]: I0320 00:08:55.421594 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:55 crc kubenswrapper[4867]: I0320 00:08:55.421657 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:55 crc kubenswrapper[4867]: E0320 00:08:55.421762 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:55 crc kubenswrapper[4867]: E0320 00:08:55.421978 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.420954 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.420966 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:56 crc kubenswrapper[4867]: E0320 00:08:56.421134 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:56 crc kubenswrapper[4867]: E0320 00:08:56.421270 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.443009 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.463266 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.484899 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: E0320 00:08:56.490481 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.501936 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.530474 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.544023 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.552937 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.564871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.578662 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.594885 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.607544 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc 
kubenswrapper[4867]: I0320 00:08:56.623025 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.644960 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.664088 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.679259 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.696398 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.711989 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:56 crc kubenswrapper[4867]: I0320 00:08:56.745004 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] 
Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:57 crc kubenswrapper[4867]: I0320 00:08:57.420951 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:57 crc kubenswrapper[4867]: I0320 00:08:57.420989 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:57 crc kubenswrapper[4867]: E0320 00:08:57.421176 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:57 crc kubenswrapper[4867]: E0320 00:08:57.421360 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.184519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.184585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.184604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.184626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.184640 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:58Z","lastTransitionTime":"2026-03-20T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.206350 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:58Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.211961 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.212034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.212059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.212085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.212103 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:58Z","lastTransitionTime":"2026-03-20T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.233323 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:58Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.238710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.238743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.238751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.238765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.238778 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:58Z","lastTransitionTime":"2026-03-20T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.258384 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:58Z is after 2025-08-24T17:21:41Z"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.262703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.262782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.262806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.262838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.262935 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:58Z","lastTransitionTime":"2026-03-20T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.281018 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.286366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.286452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.286470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.286529 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.286549 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:08:58Z","lastTransitionTime":"2026-03-20T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.307750 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.308075 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.421483 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.421736 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.421772 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:08:58 crc kubenswrapper[4867]: E0320 00:08:58.421941 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:08:58 crc kubenswrapper[4867]: I0320 00:08:58.437277 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 00:08:59 crc kubenswrapper[4867]: I0320 00:08:59.420640 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:08:59 crc kubenswrapper[4867]: E0320 00:08:59.421776 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:08:59 crc kubenswrapper[4867]: I0320 00:08:59.420727 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:08:59 crc kubenswrapper[4867]: E0320 00:08:59.421935 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:00 crc kubenswrapper[4867]: I0320 00:09:00.421593 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:00 crc kubenswrapper[4867]: I0320 00:09:00.421616 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:00 crc kubenswrapper[4867]: E0320 00:09:00.421839 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:00 crc kubenswrapper[4867]: E0320 00:09:00.422004 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:01 crc kubenswrapper[4867]: I0320 00:09:01.421512 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:01 crc kubenswrapper[4867]: I0320 00:09:01.421620 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:01 crc kubenswrapper[4867]: E0320 00:09:01.421654 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:01 crc kubenswrapper[4867]: E0320 00:09:01.421823 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:01 crc kubenswrapper[4867]: E0320 00:09:01.492357 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:09:02 crc kubenswrapper[4867]: I0320 00:09:02.421328 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:02 crc kubenswrapper[4867]: I0320 00:09:02.421363 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:02 crc kubenswrapper[4867]: E0320 00:09:02.421548 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:02 crc kubenswrapper[4867]: E0320 00:09:02.421673 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.421175 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.421175 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:03 crc kubenswrapper[4867]: E0320 00:09:03.421382 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:03 crc kubenswrapper[4867]: E0320 00:09:03.421570 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.655201 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98n2n_97e52c03-2ca5-4cad-8459-f03029234544/kube-multus/0.log" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.655285 4867 generic.go:334] "Generic (PLEG): container finished" podID="97e52c03-2ca5-4cad-8459-f03029234544" containerID="b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab" exitCode=1 Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.655324 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98n2n" event={"ID":"97e52c03-2ca5-4cad-8459-f03029234544","Type":"ContainerDied","Data":"b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab"} Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.655880 4867 scope.go:117] "RemoveContainer" containerID="b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.672663 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.697653 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.708791 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.726219 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] 
Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.739485 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.755210 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.772116 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.800910 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.820339 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.835640 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.849326 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.867277 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.883568 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.899759 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.914204 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc 
kubenswrapper[4867]: I0320 00:09:03.929577 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.955385 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:03 crc kubenswrapper[4867]: I0320 00:09:03.978624 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:03Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.003794 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.421208 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:04 crc kubenswrapper[4867]: E0320 00:09:04.421393 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.422350 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:04 crc kubenswrapper[4867]: E0320 00:09:04.422576 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.422727 4867 scope.go:117] "RemoveContainer" containerID="0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.662655 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/2.log" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.667443 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94"} Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.668114 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.671139 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98n2n_97e52c03-2ca5-4cad-8459-f03029234544/kube-multus/0.log" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.671226 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98n2n" event={"ID":"97e52c03-2ca5-4cad-8459-f03029234544","Type":"ContainerStarted","Data":"94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f"} Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.684364 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.704976 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.722151 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.738925 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished 
syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.748688 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.759934 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.784002 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.803029 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.821708 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.840359 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.856776 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.872251 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.888640 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.904821 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.913890 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc 
kubenswrapper[4867]: I0320 00:09:04.923443 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.940051 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.951256 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.959843 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.975854 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:04 crc kubenswrapper[4867]: I0320 00:09:04.993001 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.007135 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.019746 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.033608 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.047312 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.057738 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc 
kubenswrapper[4867]: I0320 00:09:05.073830 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.092524 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.108747 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.116881 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.128420 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.146423 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67
b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\
\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.157111 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.173690 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished 
syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.183301 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.194552 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.206348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.224405 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.421424 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.421639 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:05 crc kubenswrapper[4867]: E0320 00:09:05.421824 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:05 crc kubenswrapper[4867]: E0320 00:09:05.422067 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.676971 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/3.log" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.677889 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/2.log" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.682227 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94" exitCode=1 Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.682320 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94"} Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.682472 4867 scope.go:117] "RemoveContainer" containerID="0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.683685 4867 scope.go:117] "RemoveContainer" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94" Mar 20 00:09:05 crc kubenswrapper[4867]: E0320 00:09:05.684263 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:09:05 crc 
kubenswrapper[4867]: I0320 00:09:05.702654 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.729441 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.748992 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.771795 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.788662 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.825744 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] 
Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:05Z\\\",\\\"message\\\":\\\"0:09:05.276347 7426 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 00:09:05.275979 7426 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c518
6b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.846297 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.866457 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.886843 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.905189 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.935861 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.953995 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.965721 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.982549 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
0:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:05 crc kubenswrapper[4867]: I0320 00:09:05.999035 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.011289 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.024485 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc 
kubenswrapper[4867]: I0320 00:09:06.041348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.059714 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.421462 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.421559 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:06 crc kubenswrapper[4867]: E0320 00:09:06.421692 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:06 crc kubenswrapper[4867]: E0320 00:09:06.421842 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.440370 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.472319 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba4981550e2e1374231bd0049fe5d1d959fb0fe4c6cee96108ef21e704725f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:08:40Z\\\",\\\"message\\\":\\\"s \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 00:08:40.255274 7136 loadbalancer.go:304] 
Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 00:08:40.255292 7136 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.451198ms\\\\nI0320 00:08:40.255304 7136 services_controller.go:356] Processing sync for service openshift-marketplace/marketplace-operator-metrics for network=default\\\\nI0320 00:08:40.255307 7136 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0320 00:08:40.255315 7136 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:05Z\\\",\\\"message\\\":\\\"0:09:05.276347 7426 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 00:09:05.275979 7426 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c518
6b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.489430 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: E0320 00:09:06.493030 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.508428 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.530199 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.553471 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.575924 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.599308 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.620462 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.647871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.663471 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.679794 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.688677 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/3.log" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.694189 4867 scope.go:117] "RemoveContainer" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94" Mar 20 00:09:06 crc kubenswrapper[4867]: 
E0320 00:09:06.694559 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.696904 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc 
kubenswrapper[4867]: I0320 00:09:06.715711 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.745636 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.763940 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.776805 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.792824 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.808443 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67
b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\
\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.821687 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.840159 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.856897 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:05Z\\\",\\\"message\\\":\\\"0:09:05.276347 7426 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 00:09:05.275979 7426 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.867065 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.880399 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.896704 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.916807 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.930892 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.944051 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.957169 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.969291 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.981801 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:06 crc kubenswrapper[4867]: I0320 00:09:06.996140 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.009718 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc 
kubenswrapper[4867]: I0320 00:09:07.025592 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.044984 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.062657 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.073943 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.085324 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:07Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.421305 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:07 crc kubenswrapper[4867]: I0320 00:09:07.421320 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:07 crc kubenswrapper[4867]: E0320 00:09:07.421488 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:07 crc kubenswrapper[4867]: E0320 00:09:07.421652 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.421082 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.421189 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.421315 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.421458 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.422344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.422398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.422410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.422429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.422443 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:08Z","lastTransitionTime":"2026-03-20T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.436328 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.443188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.443259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.443279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.443313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.443331 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:08Z","lastTransitionTime":"2026-03-20T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.464266 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.468301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.468335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.468343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.468357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.468365 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:08Z","lastTransitionTime":"2026-03-20T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.485742 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.490693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.490733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.490745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.490761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.490771 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:08Z","lastTransitionTime":"2026-03-20T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.507706 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.513445 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.513512 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.513527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.513543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:08 crc kubenswrapper[4867]: I0320 00:09:08.513556 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:08Z","lastTransitionTime":"2026-03-20T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.530816 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:08 crc kubenswrapper[4867]: E0320 00:09:08.530931 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:09:09 crc kubenswrapper[4867]: I0320 00:09:09.421254 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:09 crc kubenswrapper[4867]: I0320 00:09:09.421368 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:09 crc kubenswrapper[4867]: E0320 00:09:09.421425 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:09 crc kubenswrapper[4867]: E0320 00:09:09.421602 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:10 crc kubenswrapper[4867]: I0320 00:09:10.420769 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:10 crc kubenswrapper[4867]: I0320 00:09:10.420858 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:10 crc kubenswrapper[4867]: E0320 00:09:10.421315 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:10 crc kubenswrapper[4867]: E0320 00:09:10.421533 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:11 crc kubenswrapper[4867]: I0320 00:09:11.421076 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:11 crc kubenswrapper[4867]: E0320 00:09:11.421266 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:11 crc kubenswrapper[4867]: I0320 00:09:11.421104 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:11 crc kubenswrapper[4867]: E0320 00:09:11.421661 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:11 crc kubenswrapper[4867]: E0320 00:09:11.494680 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:09:12 crc kubenswrapper[4867]: I0320 00:09:12.420885 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:12 crc kubenswrapper[4867]: E0320 00:09:12.421057 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:12 crc kubenswrapper[4867]: I0320 00:09:12.421228 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:12 crc kubenswrapper[4867]: E0320 00:09:12.421449 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:13 crc kubenswrapper[4867]: I0320 00:09:13.420836 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:13 crc kubenswrapper[4867]: I0320 00:09:13.421080 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:13 crc kubenswrapper[4867]: E0320 00:09:13.421356 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:13 crc kubenswrapper[4867]: E0320 00:09:13.421628 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:14 crc kubenswrapper[4867]: I0320 00:09:14.421112 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:14 crc kubenswrapper[4867]: I0320 00:09:14.421170 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:14 crc kubenswrapper[4867]: E0320 00:09:14.421295 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:14 crc kubenswrapper[4867]: E0320 00:09:14.421422 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:15 crc kubenswrapper[4867]: I0320 00:09:15.421453 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:15 crc kubenswrapper[4867]: E0320 00:09:15.421723 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:15 crc kubenswrapper[4867]: I0320 00:09:15.421534 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:15 crc kubenswrapper[4867]: E0320 00:09:15.421844 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.421092 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.421220 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:16 crc kubenswrapper[4867]: E0320 00:09:16.421335 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:16 crc kubenswrapper[4867]: E0320 00:09:16.421631 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.443059 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.462762 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.483668 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: E0320 00:09:16.495680 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.507235 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.524052 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.534417 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc 
kubenswrapper[4867]: I0320 00:09:16.548301 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.566342 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.581838 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.596378 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67
b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\
\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.610388 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.628310 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.641738 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.663453 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:05Z\\\",\\\"message\\\":\\\"0:09:05.276347 7426 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 00:09:05.275979 7426 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.672389 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.689700 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.702081 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.718811 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:16 crc kubenswrapper[4867]: I0320 00:09:16.734271 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:17 crc kubenswrapper[4867]: I0320 00:09:17.421167 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:17 crc kubenswrapper[4867]: I0320 00:09:17.421355 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:17 crc kubenswrapper[4867]: E0320 00:09:17.421534 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:17 crc kubenswrapper[4867]: E0320 00:09:17.421928 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.421549 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.421558 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.421747 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.421926 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.693173 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.693419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.693433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.693452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.693466 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:18Z","lastTransitionTime":"2026-03-20T00:09:18Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.712521 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.718190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.718276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.718299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.718332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.718355 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:18Z","lastTransitionTime":"2026-03-20T00:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.741004 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.745921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.746024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.746038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.746059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.746072 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:18Z","lastTransitionTime":"2026-03-20T00:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.763440 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.768044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.768121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.768143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.768174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.768198 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:18Z","lastTransitionTime":"2026-03-20T00:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.788990 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.797042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.797110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.797132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.797161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:18 crc kubenswrapper[4867]: I0320 00:09:18.797182 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:18Z","lastTransitionTime":"2026-03-20T00:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.818639 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:18Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:18 crc kubenswrapper[4867]: E0320 00:09:18.818865 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:09:19 crc kubenswrapper[4867]: I0320 00:09:19.421032 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:19 crc kubenswrapper[4867]: I0320 00:09:19.421092 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:19 crc kubenswrapper[4867]: E0320 00:09:19.421348 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:19 crc kubenswrapper[4867]: E0320 00:09:19.421452 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:19 crc kubenswrapper[4867]: I0320 00:09:19.421622 4867 scope.go:117] "RemoveContainer" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94" Mar 20 00:09:19 crc kubenswrapper[4867]: E0320 00:09:19.421755 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:09:20 crc kubenswrapper[4867]: I0320 00:09:20.421090 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:20 crc kubenswrapper[4867]: I0320 00:09:20.421127 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:20 crc kubenswrapper[4867]: E0320 00:09:20.421364 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:20 crc kubenswrapper[4867]: E0320 00:09:20.421535 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:21 crc kubenswrapper[4867]: I0320 00:09:21.421439 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:21 crc kubenswrapper[4867]: I0320 00:09:21.421573 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:21 crc kubenswrapper[4867]: E0320 00:09:21.421677 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:21 crc kubenswrapper[4867]: E0320 00:09:21.421798 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:21 crc kubenswrapper[4867]: E0320 00:09:21.496970 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:09:22 crc kubenswrapper[4867]: I0320 00:09:22.421421 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:22 crc kubenswrapper[4867]: I0320 00:09:22.421579 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:22 crc kubenswrapper[4867]: E0320 00:09:22.421659 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:22 crc kubenswrapper[4867]: E0320 00:09:22.421789 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:23 crc kubenswrapper[4867]: I0320 00:09:23.420711 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:23 crc kubenswrapper[4867]: E0320 00:09:23.420910 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:23 crc kubenswrapper[4867]: I0320 00:09:23.420739 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:23 crc kubenswrapper[4867]: E0320 00:09:23.421231 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:24 crc kubenswrapper[4867]: I0320 00:09:24.420764 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:24 crc kubenswrapper[4867]: I0320 00:09:24.420806 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:24 crc kubenswrapper[4867]: E0320 00:09:24.420938 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:24 crc kubenswrapper[4867]: E0320 00:09:24.421035 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:25 crc kubenswrapper[4867]: I0320 00:09:25.420515 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:25 crc kubenswrapper[4867]: I0320 00:09:25.420571 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:25 crc kubenswrapper[4867]: E0320 00:09:25.420700 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:25 crc kubenswrapper[4867]: E0320 00:09:25.420912 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.421303 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:26 crc kubenswrapper[4867]: E0320 00:09:26.421470 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.421640 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:26 crc kubenswrapper[4867]: E0320 00:09:26.421684 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.437106 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.448879 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.469436 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c16597ca-4e52-469a-b6c2-cb2c0d0f07cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48de72e5e969efb6fcf8e39cb8689a124c079f9e1eb7ac944d1a0ef141c028df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0b7d9649ba34d217c308d1fb3cf9a20a07f0a6cc4cc58c8b3af5e22e525e84b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a760911d36f694184b228962946eb51c8ce93a3f2ae258fc784fb2b7f1ae7e95\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17c925a1780a87b25613f1dd32e11d7939afdf020599789c98417f08133819a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://941c7
f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941c7f676e561504893e362200ddd6abce4f83296fe9845bbf9f7f9b68372bfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e967d173ea58171f4dc9eb1e0d44b30615f960db99610e3e50e0218c9f71ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c58f3b4dd3d0f4c3239aa5d83f03fee15266f7333b94461cdcad8714241864f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfc6c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.483725 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zgbkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02c4bd4e-640d-48b8-8e73-3aead59105b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21d2308b4cbc77b33e3aab1090f7884fbbaddf9baa96b01561e46f65a3343899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ff4ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zgbkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: E0320 00:09:26.498196 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.517461 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1af2033-700e-4f63-939d-b7132a1e5b5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:05Z\\\",\\\"message\\\":\\\"0:09:05.276347 7426 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 00:09:05.275979 7426 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d552fe3e6566bf7c6a
f0ee19811dd49817e096c6c67ce0595253e97c5186b3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-68jdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5zkft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.535329 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6186c95-4fd8-4156-ab1b-a66356d62628\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe3cdc0404a20b6a3a2483c36b65adabadcafd16f543494fabaee167f674d23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71047bdfc9aeda57b4816fead48322cf357e88d97a9ee4303b936bebaad38026\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.556122 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533c9afac6df6229a3f4faaeff394681bbaf53ccff8d93fc424e012031fa93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.571904 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.590619 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89e9fa2f-fe9b-4511-92a6-d015d83f656f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:07:34Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 00:07:34.084184 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 00:07:34.084650 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 00:07:34.085724 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2150613574/tls.crt::/tmp/serving-cert-2150613574/tls.key\\\\\\\"\\\\nI0320 00:07:34.456204 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 00:07:34.458522 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 00:07:34.458563 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 00:07:34.458598 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 00:07:34.458613 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 00:07:34.463205 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 00:07:34.463223 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 00:07:34.463221 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 00:07:34.463228 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 00:07:34.463248 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 00:07:34.463255 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 00:07:34.463271 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 00:07:34.463276 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 00:07:34.464745 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faeb38152f30ade43d98ca6bfa2b4dafa1
ae1e7348880fe1950eabf55fbcd59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.624606 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccbdd20e-480e-4aec-9bed-6042190713e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0752f15b41f3399ee9f5ee7dc8c0fe7e29f01566ef3a3d6aa6cce88f9631ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d4ade51bfde5dc6b66b07aed5c5537cc950d60578251f07f53f9097cc83fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc08a327629e634a114df1908699de50fc34a0be127e284e718c7e38f04f705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17bf9eb6d075f25b52e5bb58d58db9119169b9c67b01b4c678c19e9122016e87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://094297f1edd7e2b11d15519b5ed9d5d9d7cdc93dfdb6cf9f43e5c70f814c4b4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa196a9a525677ac4183b56e9410a3e3642a410e5bad3ec5568a5db5900cd551\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9a6761ec1e1068bd06dcf25905620c9d5ed134f30ac25a9cf4522c419190be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ab1dba4fb43b11477d9f8bcf27ad5777ba1ac00c78def4de62dcfe3e27f5765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.644564 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://248e79c251a92756344dfc2d07108125a750478b6e2f89be56af7da4f103496a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9ece665c08d84ecfe6ab9bab5efc3a86539d02a943c639432afce478650089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.659892 4867 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2xwxb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6d1df48-b16a-4691-82bf-68d8cce94a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e07fd626b99cbaea4a0b312d0b6bd6bfc6cf16a31cdf87582166bfb973187ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6kk54\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2xwxb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.676744 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b58a9381c610b509c3e824daff330fa44313ab7838aa5f429c58f5b4c1dc6a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.694398 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4d1c044-4ed7-44b6-9cd0-e52371e17e40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96399436c3bfddb78dc387498cfc6d5a29845d428dcbd600d0cd6689e8b98a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b82b7b05c5506bbe1e397bf078f2132926b97
b17c291f568581fa1d8a16eace2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xl5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tfl97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.710953 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00eacbd3-d921-414b-8b8d-c4298bdd5a28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67faa77bc13fdc46c2d03b462681bc4e5e5baf5705c33e2e2edb7eefd319d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c1984
02ee491ca1f2d66a2bb0caee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bnpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v9vbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.727731 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e040dc6-20c6-4d82-b719-bf25fa43db67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xn55k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rkq8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc 
kubenswrapper[4867]: I0320 00:09:26.745956 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5250e888-f67f-4ae9-8756-378e6308259a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e979da43d57b624616e0ebca87477cf565c87239ad02d1f0880a86c79bca523b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e241ad3f91b9521298faebb9a60d29fd246e4589999ab7ab5395a3ef13c15c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5b9624aaef5b4563371e16f176d45ac03c847e262f8a4775493a3e69313796\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd993af256c12b0636cb8eb2988e14cd31fd0e617b3c11586b07dbd91264f8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.766067 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-98n2n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97e52c03-2ca5-4cad-8459-f03029234544\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T00:09:03Z\\\",\\\"message\\\":\\\"2026-03-20T00:08:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630\\\\n2026-03-20T00:08:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16dc134d-e38a-4b8e-8261-c57fab0b7630 to /host/opt/cni/bin/\\\\n2026-03-20T00:08:18Z [verbose] multus-daemon started\\\\n2026-03-20T00:08:18Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T00:09:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:08:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:09:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvzk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-98n2n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:26 crc kubenswrapper[4867]: I0320 00:09:26.785069 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c574405e-7ef0-45c8-9d24-7f10ea314d86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2409770964a1b705e256edadb4d2dd9515aee220e3fd6e01d066b5fed2c5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fbafffb1e4506f28b3b8343e36bff5d861957c94a7cbd766f3a63969f5d3b54\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T00:06:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 00:06:28.492141 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 00:06:28.494944 1 observer_polling.go:159] Starting file observer\\\\nI0320 00:06:28.539668 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 00:06:28.544370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 00:06:54.762267 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 00:06:54.762395 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58b6faf3a88e38a3ea415a8e7104b72a12234d5ed92a7ff9dbd7ae0067d697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45cb0851df8f9363d5a2c9cc89573235d31c0a9f0773a8531255cfb8381ef24f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:27 crc kubenswrapper[4867]: I0320 00:09:27.421088 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:27 crc kubenswrapper[4867]: I0320 00:09:27.421125 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:27 crc kubenswrapper[4867]: E0320 00:09:27.421338 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:27 crc kubenswrapper[4867]: E0320 00:09:27.421551 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.421581 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.421849 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.421609 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.422287 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.872003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.872416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.872606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.872744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.872860 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:28Z","lastTransitionTime":"2026-03-20T00:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.892585 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.897690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.897911 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.898062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.898222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.898372 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:28Z","lastTransitionTime":"2026-03-20T00:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.918534 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.924247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.924309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.924328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.924355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.924374 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:28Z","lastTransitionTime":"2026-03-20T00:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.945658 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.950127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.950182 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.950202 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.950226 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.950243 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:28Z","lastTransitionTime":"2026-03-20T00:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.970238 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.976585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.977050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.977259 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.977461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:28 crc kubenswrapper[4867]: I0320 00:09:28.977712 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:28Z","lastTransitionTime":"2026-03-20T00:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 00:09:28 crc kubenswrapper[4867]: E0320 00:09:28.999652 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T00:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4229a945-e3d9-463c-a5d7-4185d2687bef\\\",\\\"systemUUID\\\":\\\"d62f574b-dd16-438d-b253-459ad966267c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T00:09:28Z is after 2025-08-24T17:21:41Z" Mar 20 00:09:29 crc kubenswrapper[4867]: E0320 00:09:29.000125 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 00:09:29 crc kubenswrapper[4867]: I0320 00:09:29.421270 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:29 crc kubenswrapper[4867]: I0320 00:09:29.421316 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:29 crc kubenswrapper[4867]: E0320 00:09:29.421788 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:29 crc kubenswrapper[4867]: E0320 00:09:29.422154 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:30 crc kubenswrapper[4867]: I0320 00:09:30.420798 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:30 crc kubenswrapper[4867]: E0320 00:09:30.421059 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:30 crc kubenswrapper[4867]: I0320 00:09:30.421114 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:30 crc kubenswrapper[4867]: E0320 00:09:30.421694 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:30 crc kubenswrapper[4867]: I0320 00:09:30.422213 4867 scope.go:117] "RemoveContainer" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94" Mar 20 00:09:30 crc kubenswrapper[4867]: E0320 00:09:30.422527 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5zkft_openshift-ovn-kubernetes(f1af2033-700e-4f63-939d-b7132a1e5b5f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" Mar 20 00:09:31 crc kubenswrapper[4867]: I0320 00:09:31.421291 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:31 crc kubenswrapper[4867]: E0320 00:09:31.421462 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:31 crc kubenswrapper[4867]: I0320 00:09:31.421291 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:31 crc kubenswrapper[4867]: E0320 00:09:31.421898 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:31 crc kubenswrapper[4867]: E0320 00:09:31.500089 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:09:32 crc kubenswrapper[4867]: I0320 00:09:32.421108 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:32 crc kubenswrapper[4867]: I0320 00:09:32.421139 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:32 crc kubenswrapper[4867]: E0320 00:09:32.421330 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:32 crc kubenswrapper[4867]: E0320 00:09:32.421352 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:33 crc kubenswrapper[4867]: I0320 00:09:33.421209 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:33 crc kubenswrapper[4867]: E0320 00:09:33.421350 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:33 crc kubenswrapper[4867]: I0320 00:09:33.421348 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:33 crc kubenswrapper[4867]: E0320 00:09:33.421748 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:34 crc kubenswrapper[4867]: I0320 00:09:34.420802 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:34 crc kubenswrapper[4867]: I0320 00:09:34.420843 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:34 crc kubenswrapper[4867]: E0320 00:09:34.421163 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:34 crc kubenswrapper[4867]: E0320 00:09:34.421267 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:35 crc kubenswrapper[4867]: I0320 00:09:35.421059 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:35 crc kubenswrapper[4867]: I0320 00:09:35.421129 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:35 crc kubenswrapper[4867]: E0320 00:09:35.421257 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:35 crc kubenswrapper[4867]: E0320 00:09:35.421389 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.420592 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.420719 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:36 crc kubenswrapper[4867]: E0320 00:09:36.420921 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:36 crc kubenswrapper[4867]: E0320 00:09:36.421039 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.477598 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=104.477567942 podStartE2EDuration="1m44.477567942s" podCreationTimestamp="2026-03-20 00:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.477385188 +0000 UTC m=+190.703922735" watchObservedRunningTime="2026-03-20 00:09:36.477567942 +0000 UTC m=+190.704105499" Mar 20 00:09:36 crc kubenswrapper[4867]: E0320 00:09:36.500830 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.548689 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2xwxb" podStartSLOduration=140.548662896 podStartE2EDuration="2m20.548662896s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.53454366 +0000 UTC m=+190.761081217" watchObservedRunningTime="2026-03-20 00:09:36.548662896 +0000 UTC m=+190.775200433" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.565550 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tfl97" podStartSLOduration=139.565531494 podStartE2EDuration="2m19.565531494s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.565344119 +0000 UTC m=+190.791881656" watchObservedRunningTime="2026-03-20 00:09:36.565531494 +0000 UTC m=+190.792069021" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.588176 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podStartSLOduration=139.58815693 podStartE2EDuration="2m19.58815693s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.57694517 +0000 UTC m=+190.803482697" watchObservedRunningTime="2026-03-20 00:09:36.58815693 +0000 UTC m=+190.814694457" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.633899 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.633878956 
podStartE2EDuration="44.633878956s" podCreationTimestamp="2026-03-20 00:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.602973265 +0000 UTC m=+190.829510802" watchObservedRunningTime="2026-03-20 00:09:36.633878956 +0000 UTC m=+190.860416483" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.652675 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=108.652651913 podStartE2EDuration="1m48.652651913s" podCreationTimestamp="2026-03-20 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.634269186 +0000 UTC m=+190.860806723" watchObservedRunningTime="2026-03-20 00:09:36.652651913 +0000 UTC m=+190.879189450" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.674270 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=55.674251783 podStartE2EDuration="55.674251783s" podCreationTimestamp="2026-03-20 00:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.652846368 +0000 UTC m=+190.879383905" watchObservedRunningTime="2026-03-20 00:09:36.674251783 +0000 UTC m=+190.900789300" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.697148 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-98n2n" podStartSLOduration=139.697128427 podStartE2EDuration="2m19.697128427s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.673967356 +0000 UTC m=+190.900504903" 
watchObservedRunningTime="2026-03-20 00:09:36.697128427 +0000 UTC m=+190.923665954" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.738560 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zgbkt" podStartSLOduration=140.738544001 podStartE2EDuration="2m20.738544001s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.738167751 +0000 UTC m=+190.964705278" watchObservedRunningTime="2026-03-20 00:09:36.738544001 +0000 UTC m=+190.965081518" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.738710 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dfc6c" podStartSLOduration=139.738706025 podStartE2EDuration="2m19.738706025s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.726634562 +0000 UTC m=+190.953172089" watchObservedRunningTime="2026-03-20 00:09:36.738706025 +0000 UTC m=+190.965243542" Mar 20 00:09:36 crc kubenswrapper[4867]: I0320 00:09:36.772880 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=38.772863361 podStartE2EDuration="38.772863361s" podCreationTimestamp="2026-03-20 00:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:36.772186983 +0000 UTC m=+190.998724490" watchObservedRunningTime="2026-03-20 00:09:36.772863361 +0000 UTC m=+190.999400878" Mar 20 00:09:37 crc kubenswrapper[4867]: I0320 00:09:37.421372 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:37 crc kubenswrapper[4867]: I0320 00:09:37.421468 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:37 crc kubenswrapper[4867]: E0320 00:09:37.421637 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:37 crc kubenswrapper[4867]: E0320 00:09:37.421813 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:38 crc kubenswrapper[4867]: I0320 00:09:38.421379 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:38 crc kubenswrapper[4867]: I0320 00:09:38.421448 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:38 crc kubenswrapper[4867]: E0320 00:09:38.421599 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:38 crc kubenswrapper[4867]: E0320 00:09:38.421731 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.140987 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.141034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.141044 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.141061 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.141072 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T00:09:39Z","lastTransitionTime":"2026-03-20T00:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.421122 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.421147 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:39 crc kubenswrapper[4867]: E0320 00:09:39.421240 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:39 crc kubenswrapper[4867]: E0320 00:09:39.421318 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.535743 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz"] Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.536420 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.538413 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.538722 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.539460 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.540353 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.575099 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b831d325-2578-40ac-a423-24c0e3fb2853-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.575162 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b831d325-2578-40ac-a423-24c0e3fb2853-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.575265 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b831d325-2578-40ac-a423-24c0e3fb2853-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.575307 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b831d325-2578-40ac-a423-24c0e3fb2853-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.575328 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b831d325-2578-40ac-a423-24c0e3fb2853-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.580873 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.589676 4867 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676380 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b831d325-2578-40ac-a423-24c0e3fb2853-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676459 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b831d325-2578-40ac-a423-24c0e3fb2853-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676540 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b831d325-2578-40ac-a423-24c0e3fb2853-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b831d325-2578-40ac-a423-24c0e3fb2853-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676714 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b831d325-2578-40ac-a423-24c0e3fb2853-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676842 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b831d325-2578-40ac-a423-24c0e3fb2853-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: 
\"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.676849 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b831d325-2578-40ac-a423-24c0e3fb2853-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.678556 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b831d325-2578-40ac-a423-24c0e3fb2853-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.682068 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b831d325-2578-40ac-a423-24c0e3fb2853-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.700122 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b831d325-2578-40ac-a423-24c0e3fb2853-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kw8vz\" (UID: \"b831d325-2578-40ac-a423-24c0e3fb2853\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:39 crc kubenswrapper[4867]: I0320 00:09:39.861370 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" Mar 20 00:09:40 crc kubenswrapper[4867]: I0320 00:09:40.421563 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:40 crc kubenswrapper[4867]: I0320 00:09:40.421725 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h" Mar 20 00:09:40 crc kubenswrapper[4867]: E0320 00:09:40.421784 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 00:09:40 crc kubenswrapper[4867]: E0320 00:09:40.422046 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67" Mar 20 00:09:40 crc kubenswrapper[4867]: I0320 00:09:40.818096 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" event={"ID":"b831d325-2578-40ac-a423-24c0e3fb2853","Type":"ContainerStarted","Data":"7c2537b782463f32ca008c869bf371c8eb84f77902c8eae7a0d3802923f1203a"} Mar 20 00:09:40 crc kubenswrapper[4867]: I0320 00:09:40.818140 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" event={"ID":"b831d325-2578-40ac-a423-24c0e3fb2853","Type":"ContainerStarted","Data":"1743009274d66d912b31b1814ff8d7414937f45abc9e23b07eafb518d03d8583"} Mar 20 00:09:40 crc kubenswrapper[4867]: I0320 00:09:40.832465 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kw8vz" podStartSLOduration=144.832447075 podStartE2EDuration="2m24.832447075s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:40.831744567 +0000 UTC m=+195.058282094" watchObservedRunningTime="2026-03-20 00:09:40.832447075 +0000 UTC m=+195.058984612" Mar 20 00:09:41 crc kubenswrapper[4867]: I0320 00:09:41.421651 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 00:09:41 crc kubenswrapper[4867]: I0320 00:09:41.421651 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 00:09:41 crc kubenswrapper[4867]: E0320 00:09:41.421859 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 00:09:41 crc kubenswrapper[4867]: E0320 00:09:41.422008 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 00:09:41 crc kubenswrapper[4867]: E0320 00:09:41.501601 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 00:09:42 crc kubenswrapper[4867]: I0320 00:09:42.421453 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 00:09:42 crc kubenswrapper[4867]: I0320 00:09:42.421543 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:42 crc kubenswrapper[4867]: E0320 00:09:42.421713 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 00:09:42 crc kubenswrapper[4867]: E0320 00:09:42.421887 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:09:43 crc kubenswrapper[4867]: I0320 00:09:43.420557 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:43 crc kubenswrapper[4867]: I0320 00:09:43.420619 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:43 crc kubenswrapper[4867]: E0320 00:09:43.420874 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 00:09:43 crc kubenswrapper[4867]: E0320 00:09:43.421023 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 00:09:44 crc kubenswrapper[4867]: I0320 00:09:44.421002 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:44 crc kubenswrapper[4867]: E0320 00:09:44.421230 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 00:09:44 crc kubenswrapper[4867]: I0320 00:09:44.421306 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:44 crc kubenswrapper[4867]: E0320 00:09:44.421747 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.421643 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.421670 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:45 crc kubenswrapper[4867]: E0320 00:09:45.421847 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 00:09:45 crc kubenswrapper[4867]: E0320 00:09:45.422476 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.422967 4867 scope.go:117] "RemoveContainer" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94"
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.836744 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/3.log"
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.838902 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerStarted","Data":"68269af659165297c90238e7a18f8776fb268e39ddd21dbbae7f391687bb05ee"}
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.839332 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft"
Mar 20 00:09:45 crc kubenswrapper[4867]: I0320 00:09:45.881341 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podStartSLOduration=148.881316998 podStartE2EDuration="2m28.881316998s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:45.880118398 +0000 UTC m=+200.106656025" watchObservedRunningTime="2026-03-20 00:09:45.881316998 +0000 UTC m=+200.107854555"
Mar 20 00:09:46 crc kubenswrapper[4867]: I0320 00:09:46.421674 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:46 crc kubenswrapper[4867]: E0320 00:09:46.423435 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:09:46 crc kubenswrapper[4867]: I0320 00:09:46.423636 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:46 crc kubenswrapper[4867]: E0320 00:09:46.423815 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 00:09:46 crc kubenswrapper[4867]: I0320 00:09:46.488422 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rkq8h"]
Mar 20 00:09:46 crc kubenswrapper[4867]: E0320 00:09:46.502750 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 00:09:46 crc kubenswrapper[4867]: I0320 00:09:46.842530 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:46 crc kubenswrapper[4867]: E0320 00:09:46.843574 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:09:47 crc kubenswrapper[4867]: I0320 00:09:47.420790 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:47 crc kubenswrapper[4867]: E0320 00:09:47.420949 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 00:09:47 crc kubenswrapper[4867]: I0320 00:09:47.420790 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:47 crc kubenswrapper[4867]: E0320 00:09:47.421327 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 00:09:48 crc kubenswrapper[4867]: I0320 00:09:48.421241 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:48 crc kubenswrapper[4867]: E0320 00:09:48.421377 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 00:09:48 crc kubenswrapper[4867]: I0320 00:09:48.421241 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:48 crc kubenswrapper[4867]: E0320 00:09:48.421564 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:09:49 crc kubenswrapper[4867]: I0320 00:09:49.421014 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:49 crc kubenswrapper[4867]: I0320 00:09:49.421093 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:49 crc kubenswrapper[4867]: E0320 00:09:49.421276 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 00:09:49 crc kubenswrapper[4867]: E0320 00:09:49.421546 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 00:09:50 crc kubenswrapper[4867]: I0320 00:09:50.421172 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:50 crc kubenswrapper[4867]: I0320 00:09:50.421330 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:50 crc kubenswrapper[4867]: E0320 00:09:50.421555 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 00:09:50 crc kubenswrapper[4867]: E0320 00:09:50.423241 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rkq8h" podUID="0e040dc6-20c6-4d82-b719-bf25fa43db67"
Mar 20 00:09:51 crc kubenswrapper[4867]: I0320 00:09:51.421327 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:51 crc kubenswrapper[4867]: I0320 00:09:51.421328 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:51 crc kubenswrapper[4867]: E0320 00:09:51.421534 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 00:09:51 crc kubenswrapper[4867]: E0320 00:09:51.421670 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 00:09:52 crc kubenswrapper[4867]: I0320 00:09:52.420616 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:52 crc kubenswrapper[4867]: I0320 00:09:52.420700 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:52 crc kubenswrapper[4867]: I0320 00:09:52.423008 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 00:09:52 crc kubenswrapper[4867]: I0320 00:09:52.423876 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 00:09:52 crc kubenswrapper[4867]: I0320 00:09:52.424157 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 00:09:52 crc kubenswrapper[4867]: I0320 00:09:52.425693 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 00:09:53 crc kubenswrapper[4867]: I0320 00:09:53.420665 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:53 crc kubenswrapper[4867]: I0320 00:09:53.420675 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:53 crc kubenswrapper[4867]: I0320 00:09:53.424166 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 00:09:53 crc kubenswrapper[4867]: I0320 00:09:53.427904 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.361785 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:09:56 crc kubenswrapper[4867]: E0320 00:09:56.361963 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:11:58.361939176 +0000 UTC m=+332.588476703 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.363030 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.363118 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.363172 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.364649 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.371518 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.371973 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.455866 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.464257 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.464298 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.470099 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.470116 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e040dc6-20c6-4d82-b719-bf25fa43db67-metrics-certs\") pod \"network-metrics-daemon-rkq8h\" (UID: \"0e040dc6-20c6-4d82-b719-bf25fa43db67\") " pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.645189 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.660076 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rkq8h"
Mar 20 00:09:56 crc kubenswrapper[4867]: W0320 00:09:56.706858 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-dfe12140cb9ee1c9dfe968ab4084a7767209d19b51bdfb75bef6e941dadcddaf WatchSource:0}: Error finding container dfe12140cb9ee1c9dfe968ab4084a7767209d19b51bdfb75bef6e941dadcddaf: Status 404 returned error can't find the container with id dfe12140cb9ee1c9dfe968ab4084a7767209d19b51bdfb75bef6e941dadcddaf
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.742632 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.891724 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rkq8h"]
Mar 20 00:09:56 crc kubenswrapper[4867]: I0320 00:09:56.892205 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"dfe12140cb9ee1c9dfe968ab4084a7767209d19b51bdfb75bef6e941dadcddaf"}
Mar 20 00:09:56 crc kubenswrapper[4867]: W0320 00:09:56.896623 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-12fbf5abba9c63aa4e446a95a8515f361062965faea79543cf698aa6ace70d12 WatchSource:0}: Error finding container 12fbf5abba9c63aa4e446a95a8515f361062965faea79543cf698aa6ace70d12: Status 404 returned error can't find the container with id 12fbf5abba9c63aa4e446a95a8515f361062965faea79543cf698aa6ace70d12
Mar 20 00:09:56 crc kubenswrapper[4867]: W0320 00:09:56.898543 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e040dc6_20c6_4d82_b719_bf25fa43db67.slice/crio-3e8b241c57b68ed0a227dc5836adc5a4f7dacb8b91b1f121c17d5384b30e3e2c WatchSource:0}: Error finding container 3e8b241c57b68ed0a227dc5836adc5a4f7dacb8b91b1f121c17d5384b30e3e2c: Status 404 returned error can't find the container with id 3e8b241c57b68ed0a227dc5836adc5a4f7dacb8b91b1f121c17d5384b30e3e2c
Mar 20 00:09:56 crc kubenswrapper[4867]: W0320 00:09:56.957677 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-fb89fe1569e53010a4404a4ebfcab9963a6c62d2f190a1f45a98b5e73d2dce07 WatchSource:0}: Error finding container fb89fe1569e53010a4404a4ebfcab9963a6c62d2f190a1f45a98b5e73d2dce07: Status 404 returned error can't find the container with id fb89fe1569e53010a4404a4ebfcab9963a6c62d2f190a1f45a98b5e73d2dce07
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.898696 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" event={"ID":"0e040dc6-20c6-4d82-b719-bf25fa43db67","Type":"ContainerStarted","Data":"2f3fdda587195a1244ae2cfa863aaed5ed54a4803f599d0822a12c6552b4abfb"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.898766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" event={"ID":"0e040dc6-20c6-4d82-b719-bf25fa43db67","Type":"ContainerStarted","Data":"5595eaa1cda00eb134d51af0d675fa716cab06b98b6a5c9fefc9cb46c904e586"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.898785 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rkq8h" event={"ID":"0e040dc6-20c6-4d82-b719-bf25fa43db67","Type":"ContainerStarted","Data":"3e8b241c57b68ed0a227dc5836adc5a4f7dacb8b91b1f121c17d5384b30e3e2c"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.903059 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e8e54cde134f4987030e088766045c81fafea7554bafb7bb63146d723d06c027"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.903172 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"12fbf5abba9c63aa4e446a95a8515f361062965faea79543cf698aa6ace70d12"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.904359 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.907930 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1c38c381853d34404b517568d77b4ecef6d88d6ab1c64a9785d9e5711113e2d6"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.908025 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fb89fe1569e53010a4404a4ebfcab9963a6c62d2f190a1f45a98b5e73d2dce07"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.910063 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"619b81e3ad3086a5f11af2eefd1e04461829ad5e9e91c1a02b0a2ea76e8a149a"}
Mar 20 00:09:57 crc kubenswrapper[4867]: I0320 00:09:57.929970 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rkq8h" podStartSLOduration=160.929950122 podStartE2EDuration="2m40.929950122s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:09:57.92866875 +0000 UTC m=+212.155206347" watchObservedRunningTime="2026-03-20 00:09:57.929950122 +0000 UTC m=+212.156487669"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.818820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.870963 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n22zb"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.874858 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmk6s"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.875750 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.878531 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.879666 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.880030 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.880219 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vbfqm"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.881204 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.882108 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xslxh"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.882746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.884930 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.885848 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.887625 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.887738 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.889454 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.889779 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.890010 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.890455 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.895999 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.904701 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.905146 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.905599 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.905825 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.906044 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.906096 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.906284 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.906520 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.911026 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.912229 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.913853 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.914375 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.914612 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.915370 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.915942 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.916003 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.916295 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.916696 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.924661 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.927349 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.927478 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.938205 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.955690 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.955909 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.956254 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bwpfb"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.956848 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.956897 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.957240 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tp4c2"]
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.957822 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tp4c2"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958063 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-etcd-client\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-client-ca\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958108 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7w4q\" (UniqueName: \"kubernetes.io/projected/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-kube-api-access-r7w4q\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: 
\"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsr2h\" (UniqueName: \"kubernetes.io/projected/c75b5610-6996-4d26-b251-04176399fd0b-kube-api-access-tsr2h\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958160 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-encryption-config\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958176 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgmt2\" (UniqueName: \"kubernetes.io/projected/fb29a322-7b58-4faf-9649-fc79f35c2e52-kube-api-access-jgmt2\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958190 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-config\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958205 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-etcd-serving-ca\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958241 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393fa798-a141-4d7d-8c72-5289bd6f3b5c-serving-cert\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958256 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-service-ca-bundle\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958290 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-config\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: 
\"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958305 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb29a322-7b58-4faf-9649-fc79f35c2e52-config\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958319 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c75b5610-6996-4d26-b251-04176399fd0b-node-pullsecrets\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958364 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fb29a322-7b58-4faf-9649-fc79f35c2e52-machine-approver-tls\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958380 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-serving-cert\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958445 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-serving-cert\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958555 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-audit\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958586 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-image-import-ca\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958617 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-config\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958646 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c75b5610-6996-4d26-b251-04176399fd0b-audit-dir\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958668 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpd4l\" (UniqueName: \"kubernetes.io/projected/393fa798-a141-4d7d-8c72-5289bd6f3b5c-kube-api-access-lpd4l\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958728 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb29a322-7b58-4faf-9649-fc79f35c2e52-auth-proxy-config\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958747 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-config\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-images\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958847 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnz7j\" (UniqueName: \"kubernetes.io/projected/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-kube-api-access-lnz7j\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.958879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.963739 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.965193 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.965374 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" 
Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.965461 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.965619 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.965904 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.966103 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.969832 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.970034 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.970185 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.970476 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.971131 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.976574 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.978091 4867 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.978570 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29566080-vq54z"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.978851 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.985244 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.988832 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.990042 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.990455 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.990861 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zvttc"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.991257 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mzkng"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.991579 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.991882 4867 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5"] Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.992225 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.992412 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.992623 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.992804 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.992905 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.993024 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:09:59 crc kubenswrapper[4867]: I0320 00:09:59.993171 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.000875 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.000921 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001068 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001115 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001155 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001180 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001291 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001300 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001399 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001507 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.001589 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.005954 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.006167 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.006686 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7gtj4"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.019772 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.022783 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.026280 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028195 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028410 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028442 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 20 00:10:00 
crc kubenswrapper[4867]: I0320 00:10:00.028564 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028664 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028793 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028836 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028898 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.028951 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029001 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029051 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029101 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029146 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029206 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 
00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029400 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029522 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029611 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029704 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029779 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029859 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.029940 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.030022 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.030103 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.030645 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.030881 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.033801 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.039114 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.039712 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r2ssr"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.040107 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r2ssr"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.041412 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.041958 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.042123 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fxnfx"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.044613 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.044927 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.045029 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.046252 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.046468 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.046657 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.046794 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.046889 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.046908 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.047007 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.047196 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.047433 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.047769 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.047865 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.049144 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmk6s"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.051142 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.051235 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.051368 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.051584 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.051745 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.051839 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.058302 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.058929 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059357 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393fa798-a141-4d7d-8c72-5289bd6f3b5c-serving-cert\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059388 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-service-ca-bundle\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059408 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-config\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059425 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb29a322-7b58-4faf-9649-fc79f35c2e52-config\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059440 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c75b5610-6996-4d26-b251-04176399fd0b-node-pullsecrets\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059454 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fb29a322-7b58-4faf-9649-fc79f35c2e52-machine-approver-tls\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059469 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-serving-cert\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059503 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-serving-cert\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-audit\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059533 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-image-import-ca\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059547 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-config\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059562 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c75b5610-6996-4d26-b251-04176399fd0b-audit-dir\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059575 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpd4l\" (UniqueName: \"kubernetes.io/projected/393fa798-a141-4d7d-8c72-5289bd6f3b5c-kube-api-access-lpd4l\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059591 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-config\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059604 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059618 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb29a322-7b58-4faf-9649-fc79f35c2e52-auth-proxy-config\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059642 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-images\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059657 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059691 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059705 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnz7j\" (UniqueName: \"kubernetes.io/projected/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-kube-api-access-lnz7j\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-etcd-client\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059733 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-client-ca\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059755 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7w4q\" (UniqueName: \"kubernetes.io/projected/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-kube-api-access-r7w4q\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059771 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059789 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsr2h\" (UniqueName: \"kubernetes.io/projected/c75b5610-6996-4d26-b251-04176399fd0b-kube-api-access-tsr2h\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059805 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-encryption-config\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059820 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgmt2\" (UniqueName: \"kubernetes.io/projected/fb29a322-7b58-4faf-9649-fc79f35c2e52-kube-api-access-jgmt2\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059835 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-etcd-serving-ca\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.059850 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-config\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.060401 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-config\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.060898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb29a322-7b58-4faf-9649-fc79f35c2e52-auth-proxy-config\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.061092 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.061480 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-service-ca-bundle\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.062785 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-config\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.063091 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb29a322-7b58-4faf-9649-fc79f35c2e52-config\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.063867 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.064790 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-images\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.066911 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.067883 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.068085 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.069061 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-client-ca\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.069628 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.070440 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-image-import-ca\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.071364 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c75b5610-6996-4d26-b251-04176399fd0b-audit-dir\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.071895 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-config\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.082514 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-encryption-config\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.072236 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393fa798-a141-4d7d-8c72-5289bd6f3b5c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.072296 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-audit\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.063141 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c75b5610-6996-4d26-b251-04176399fd0b-node-pullsecrets\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.073089 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-config\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.074813 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393fa798-a141-4d7d-8c72-5289bd6f3b5c-serving-cert\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.078366 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.078585 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-etcd-client\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.072224 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c75b5610-6996-4d26-b251-04176399fd0b-etcd-serving-ca\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.083697 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-v4wzv"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.084129 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.083871 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.084597 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9n97f"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.084698 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.084955 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c75b5610-6996-4d26-b251-04176399fd0b-serving-cert\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.084245 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.085082 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fhrs"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.085182 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9n97f"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.085763 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tp4c2"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.085794 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vbfqm"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.085805 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.086004 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.086506 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.086762 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n22zb"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.086780 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.087346 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.087720 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.088154 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.088433 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.088762 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.088826 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.088946 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.089099 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.089223 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.089382 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.089539 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.089644 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.090271 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fls4v"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.090615 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fls4v"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.091127 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.091367 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p9hfb"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.096565 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.096574 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fb29a322-7b58-4faf-9649-fc79f35c2e52-machine-approver-tls\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.096746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.097760 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.097776 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566088-pbdgz"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.102833 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xslxh"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.102967 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z88lc"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.102979 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566088-pbdgz"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.105828 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29566080-vq54z"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.106254 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z88lc"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.108249 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.108426 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.108465 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.109066 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-serving-cert\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.115703 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zvttc"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.118699 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.118844 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.119453 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.120837 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.121729 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.124090 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fxnfx"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.126107 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.128011 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.129203 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.130908 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r2ssr"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.132927 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p9hfb"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.135113 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z88lc"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.137168 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7gtj4"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.138629 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.140296 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-v4wzv"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.141118 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qnw4x"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.141693 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnw4x"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.142340 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7smtj"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.143481 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7smtj"
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.144249 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mzkng"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.145771 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.147344 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fhrs"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.148943 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.150026 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv"]
Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.151979 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-authentication/oauth-openshift-558db77b4-bwpfb"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.153307 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.154332 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.156778 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.157778 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.158515 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.158968 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnw4x"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.159997 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.161014 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.162079 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fls4v"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.163906 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.165420 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566088-pbdgz"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.166998 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.168889 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7smtj"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.170382 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pr89k"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.171299 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.178426 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.200001 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.205536 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566090-fczg6"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.206089 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.208756 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566090-fczg6"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.218703 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.238297 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.258249 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.278995 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.298220 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.317939 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.339052 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.358555 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.379262 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.418921 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.439951 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.459137 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.530759 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7w4q\" (UniqueName: \"kubernetes.io/projected/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-kube-api-access-r7w4q\") pod \"controller-manager-879f6c89f-nmk6s\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.539197 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgmt2\" (UniqueName: \"kubernetes.io/projected/fb29a322-7b58-4faf-9649-fc79f35c2e52-kube-api-access-jgmt2\") pod \"machine-approver-56656f9798-kl9jq\" (UID: \"fb29a322-7b58-4faf-9649-fc79f35c2e52\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.568772 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsr2h\" (UniqueName: \"kubernetes.io/projected/c75b5610-6996-4d26-b251-04176399fd0b-kube-api-access-tsr2h\") pod \"apiserver-76f77b778f-vbfqm\" (UID: \"c75b5610-6996-4d26-b251-04176399fd0b\") " pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.585166 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpd4l\" (UniqueName: \"kubernetes.io/projected/393fa798-a141-4d7d-8c72-5289bd6f3b5c-kube-api-access-lpd4l\") pod \"authentication-operator-69f744f599-xslxh\" (UID: \"393fa798-a141-4d7d-8c72-5289bd6f3b5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.599700 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.604140 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnz7j\" (UniqueName: \"kubernetes.io/projected/f6ed1b0f-b167-4dd9-9cfb-0687dad12d05-kube-api-access-lnz7j\") pod \"machine-api-operator-5694c8668f-n22zb\" (UID: \"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.605752 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.621607 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.639250 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.639441 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" Mar 20 00:10:00 crc kubenswrapper[4867]: W0320 00:10:00.657747 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb29a322_7b58_4faf_9649_fc79f35c2e52.slice/crio-349ed1f59574736bc41b2be527d0e9e518be370647e95cf3d7c1884b850f0874 WatchSource:0}: Error finding container 349ed1f59574736bc41b2be527d0e9e518be370647e95cf3d7c1884b850f0874: Status 404 returned error can't find the container with id 349ed1f59574736bc41b2be527d0e9e518be370647e95cf3d7c1884b850f0874 Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.659758 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.678930 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.698861 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.719921 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.739634 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.758881 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.778412 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.796007 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xslxh"] Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.799152 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: W0320 00:10:00.802261 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod393fa798_a141_4d7d_8c72_5289bd6f3b5c.slice/crio-c6921edd463cd9689f62304f44948dd5ae56d6d888d2d988520731ff6e408823 WatchSource:0}: Error finding container c6921edd463cd9689f62304f44948dd5ae56d6d888d2d988520731ff6e408823: Status 404 returned error can't find the container with id c6921edd463cd9689f62304f44948dd5ae56d6d888d2d988520731ff6e408823 Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.804705 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.818424 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.839071 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.857663 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.860002 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.864451 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.879763 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.901084 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.919846 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.938739 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.958849 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.968273 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" event={"ID":"fb29a322-7b58-4faf-9649-fc79f35c2e52","Type":"ContainerStarted","Data":"15cb05907897830b182e1b2b1fcd065001ad99ae4337de0bcd838bd200902df6"} Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.968332 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" 
event={"ID":"fb29a322-7b58-4faf-9649-fc79f35c2e52","Type":"ContainerStarted","Data":"349ed1f59574736bc41b2be527d0e9e518be370647e95cf3d7c1884b850f0874"} Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.970051 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" event={"ID":"393fa798-a141-4d7d-8c72-5289bd6f3b5c","Type":"ContainerStarted","Data":"5be9954708cc4eb939b0f47b12e64575ec3cfcc72986f967ac4a8154d2eb53a5"} Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.970074 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" event={"ID":"393fa798-a141-4d7d-8c72-5289bd6f3b5c","Type":"ContainerStarted","Data":"c6921edd463cd9689f62304f44948dd5ae56d6d888d2d988520731ff6e408823"} Mar 20 00:10:00 crc kubenswrapper[4867]: I0320 00:10:00.979442 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.002767 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.019280 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.039244 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.058925 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmk6s"] Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.061364 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 00:10:01 crc kubenswrapper[4867]: W0320 00:10:01.071071 4867 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a079af5_b58c_44d4_baa5_fc1bfab08cbb.slice/crio-8243a8e5435ae1e8d35eb6cca4db23e54f8d6696e921fcde1c55945fb8378e0d WatchSource:0}: Error finding container 8243a8e5435ae1e8d35eb6cca4db23e54f8d6696e921fcde1c55945fb8378e0d: Status 404 returned error can't find the container with id 8243a8e5435ae1e8d35eb6cca4db23e54f8d6696e921fcde1c55945fb8378e0d Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.079703 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.092196 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-n22zb"] Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.094156 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vbfqm"] Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.097563 4867 request.go:700] Waited for 1.011396264s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dmarketplace-trusted-ca&limit=500&resourceVersion=0 Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.104579 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 00:10:01 crc kubenswrapper[4867]: W0320 00:10:01.115078 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75b5610_6996_4d26_b251_04176399fd0b.slice/crio-0a33c014825c422a74bae05a0752738a9e7f93452575c573bd713a8a98e1fbf2 WatchSource:0}: Error finding container 0a33c014825c422a74bae05a0752738a9e7f93452575c573bd713a8a98e1fbf2: Status 404 returned error can't find the container with id 
0a33c014825c422a74bae05a0752738a9e7f93452575c573bd713a8a98e1fbf2 Mar 20 00:10:01 crc kubenswrapper[4867]: W0320 00:10:01.115801 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ed1b0f_b167_4dd9_9cfb_0687dad12d05.slice/crio-a8bd7a146a92ff3b429fb93d6593859291f1bd4265da27f051fb0f45f18c1e07 WatchSource:0}: Error finding container a8bd7a146a92ff3b429fb93d6593859291f1bd4265da27f051fb0f45f18c1e07: Status 404 returned error can't find the container with id a8bd7a146a92ff3b429fb93d6593859291f1bd4265da27f051fb0f45f18c1e07 Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.119005 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.138973 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.159280 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.178776 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.199009 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.218119 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.238329 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.259187 
4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.280467 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.298482 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.319053 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.339244 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.358384 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.378883 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.398652 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.418893 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.439266 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.458936 4867 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.477965 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.498881 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.526574 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.538882 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.558566 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.579831 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.598586 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.619292 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.640111 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.659206 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" 
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.678557 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.699168 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.719785 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.739457 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.759411 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.781433 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.799726 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.818552 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.840604 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.859061 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.879173 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 
00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.898340 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.918760 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.938830 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.959371 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.978391 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" event={"ID":"3a079af5-b58c-44d4-baa5-fc1bfab08cbb","Type":"ContainerStarted","Data":"04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.978860 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" event={"ID":"3a079af5-b58c-44d4-baa5-fc1bfab08cbb","Type":"ContainerStarted","Data":"8243a8e5435ae1e8d35eb6cca4db23e54f8d6696e921fcde1c55945fb8378e0d"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.978891 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.982072 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.984977 4867 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nmk6s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.985044 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" podUID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.986037 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" event={"ID":"fb29a322-7b58-4faf-9649-fc79f35c2e52","Type":"ContainerStarted","Data":"5d785288373cf2aef06f67711860dea1105e20f5b2dddc5b124c3c52e8d0f8df"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.987584 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" event={"ID":"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05","Type":"ContainerStarted","Data":"ca4d5c42e54de7ec84fa7cc23722f6c91e054253dc0fcd895065e2efaf6373a8"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.987727 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" event={"ID":"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05","Type":"ContainerStarted","Data":"9bed21e3e1b7bfaafdbd955cb318b202cdb6cfa21adf4ff56f5e4987d55b445f"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.987819 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" event={"ID":"f6ed1b0f-b167-4dd9-9cfb-0687dad12d05","Type":"ContainerStarted","Data":"a8bd7a146a92ff3b429fb93d6593859291f1bd4265da27f051fb0f45f18c1e07"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.989161 4867 generic.go:334] "Generic (PLEG): container finished" podID="c75b5610-6996-4d26-b251-04176399fd0b" containerID="4f283fac8a737aa4ebcca249132e764018fb4ead201c586416a59755b2f9b7eb" exitCode=0
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.989252 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" event={"ID":"c75b5610-6996-4d26-b251-04176399fd0b","Type":"ContainerDied","Data":"4f283fac8a737aa4ebcca249132e764018fb4ead201c586416a59755b2f9b7eb"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.989282 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" event={"ID":"c75b5610-6996-4d26-b251-04176399fd0b","Type":"ContainerStarted","Data":"0a33c014825c422a74bae05a0752738a9e7f93452575c573bd713a8a98e1fbf2"}
Mar 20 00:10:01 crc kubenswrapper[4867]: I0320 00:10:01.998661 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.012889 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38478: no serving certificate available for the kubelet"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.025126 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.038142 4867 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.041472 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38490: no serving certificate available for the kubelet"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.058847 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.078172 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.097622 4867 request.go:700] Waited for 1.926110214s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.098981 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.113847 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38496: no serving certificate available for the kubelet"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.119006 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.139381 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.150655 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38502: no serving certificate available for the kubelet"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174819 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-certificates\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174847 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174876 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174900 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174938 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-audit-dir\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174956 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfxxf\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-kube-api-access-nfxxf\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174976 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f35b5e9-505a-4269-a74d-7a4523284289-config\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.174998 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a6b937c9-ef34-4082-9daf-58994e3a7272-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175023 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b02d37d7-8347-4725-8964-2adad8d0e673-proxy-tls\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-trusted-ca-bundle\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175168 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-oauth-serving-cert\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175210 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175231 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7d2h\" (UniqueName: \"kubernetes.io/projected/3283192e-4886-4409-8a9e-dda503c25e85-kube-api-access-c7d2h\") pod \"downloads-7954f5f757-tp4c2\" (UID: \"3283192e-4886-4409-8a9e-dda503c25e85\") " pod="openshift-console/downloads-7954f5f757-tp4c2"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175249 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff274f4-3245-4179-b42f-7d113812766b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175264 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1656bc75-cc92-468d-b969-0fe7cb29c705-console-serving-cert\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175280 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175294 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-encryption-config\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175340 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-trusted-ca\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175371 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpkfj\" (UniqueName: \"kubernetes.io/projected/2f35b5e9-505a-4269-a74d-7a4523284289-kube-api-access-xpkfj\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175386 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175403 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175422 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-console-config\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-audit-policies\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175457 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff46e39-23b9-4930-80d0-89d5d91ed2a4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175480 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6edfa360-6c55-4e5a-a26c-a78eacd4a143-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4wmcx\" (UID: \"6edfa360-6c55-4e5a-a26c-a78eacd4a143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175515 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-serving-cert\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175537 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a0a088-a592-47fd-b92a-b73db5432a7b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175574 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df8ee0a-f645-4f19-b75c-9cd29e21be30-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175718 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f35b5e9-505a-4269-a74d-7a4523284289-serving-cert\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175764 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-tls\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175801 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175835 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8171ee3e-3418-4537-b2a5-52941744cba6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175869 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vn8\" (UniqueName: \"kubernetes.io/projected/b02d37d7-8347-4725-8964-2adad8d0e673-kube-api-access-b2vn8\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.175962 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-dir\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176011 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2wf\" (UniqueName: \"kubernetes.io/projected/eff46e39-23b9-4930-80d0-89d5d91ed2a4-kube-api-access-zq2wf\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176053 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-serviceca\") pod \"image-pruner-29566080-vq54z\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " pod="openshift-image-registry/image-pruner-29566080-vq54z"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176084 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gql\" (UniqueName: \"kubernetes.io/projected/8171ee3e-3418-4537-b2a5-52941744cba6-kube-api-access-f4gql\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176113 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zc95\" (UniqueName: \"kubernetes.io/projected/a6b937c9-ef34-4082-9daf-58994e3a7272-kube-api-access-2zc95\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176129 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-etcd-client\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df8ee0a-f645-4f19-b75c-9cd29e21be30-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f35b5e9-505a-4269-a74d-7a4523284289-trusted-ca\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176181 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlcr\" (UniqueName: \"kubernetes.io/projected/6edfa360-6c55-4e5a-a26c-a78eacd4a143-kube-api-access-8zlcr\") pod \"cluster-samples-operator-665b6dd947-4wmcx\" (UID: \"6edfa360-6c55-4e5a-a26c-a78eacd4a143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176231 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176247 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176262 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176277 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff274f4-3245-4179-b42f-7d113812766b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176297 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b02d37d7-8347-4725-8964-2adad8d0e673-images\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176436 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176463 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8171ee3e-3418-4537-b2a5-52941744cba6-srv-cert\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"
Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.176588 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:02.676568025 +0000 UTC m=+216.903105552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176669 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dcsh\" (UniqueName: \"kubernetes.io/projected/fb1e4d21-3692-4e9d-914c-12cd30f641fa-kube-api-access-2dcsh\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176711 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176802 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a0a088-a592-47fd-b92a-b73db5432a7b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176828 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7vc\" (UniqueName: \"kubernetes.io/projected/0ff274f4-3245-4179-b42f-7d113812766b-kube-api-access-hq7vc\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176849 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ltv5\" (UniqueName: \"kubernetes.io/projected/5fceb997-3744-48c3-94fd-4960c139b01d-kube-api-access-2ltv5\") pod \"dns-operator-744455d44c-fxnfx\" (UID: \"5fceb997-3744-48c3-94fd-4960c139b01d\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176865 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b937c9-ef34-4082-9daf-58994e3a7272-serving-cert\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176968 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c795069-a3a9-4cdd-994d-23b51b69f386-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxlq5\" (UID: \"3c795069-a3a9-4cdd-994d-23b51b69f386\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.176999 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmz9\" (UniqueName: \"kubernetes.io/projected/3c795069-a3a9-4cdd-994d-23b51b69f386-kube-api-access-swmz9\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxlq5\" (UID: \"3c795069-a3a9-4cdd-994d-23b51b69f386\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177203 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177278 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-config\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177312 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff46e39-23b9-4930-80d0-89d5d91ed2a4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fceb997-3744-48c3-94fd-4960c139b01d-metrics-tls\") pod \"dns-operator-744455d44c-fxnfx\" (UID: \"5fceb997-3744-48c3-94fd-4960c139b01d\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177370 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw62n\" (UniqueName: \"kubernetes.io/projected/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-kube-api-access-zw62n\") pod \"image-pruner-29566080-vq54z\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " pod="openshift-image-registry/image-pruner-29566080-vq54z"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-policies\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177412 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1656bc75-cc92-468d-b969-0fe7cb29c705-console-oauth-config\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177429 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177447 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-service-ca\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177463 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b02d37d7-8347-4725-8964-2adad8d0e673-auth-proxy-config\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98a0a088-a592-47fd-b92a-b73db5432a7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177518 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jtk\" (UniqueName: \"kubernetes.io/projected/98a0a088-a592-47fd-b92a-b73db5432a7b-kube-api-access-z7jtk\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"
Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177535 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-j5m2b\" (UniqueName: \"kubernetes.io/projected/1656bc75-cc92-468d-b969-0fe7cb29c705-kube-api-access-j5m2b\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177553 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-679w4\" (UniqueName: \"kubernetes.io/projected/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-kube-api-access-679w4\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177570 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-bound-sa-token\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.177587 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfdq\" (UniqueName: \"kubernetes.io/projected/42544638-2b4a-4acd-bd73-e1fc16a3c921-kube-api-access-xpfdq\") pod \"migrator-59844c95c7-ghlsv\" (UID: \"42544638-2b4a-4acd-bd73-e1fc16a3c921\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.219745 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38514: no serving certificate available for the kubelet" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278208 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.278359 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:02.778331126 +0000 UTC m=+217.004868653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278397 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/585bf14f-1828-4820-b5b2-2aa8992b4499-apiservice-cert\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278427 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2635e37-95ea-4edd-a446-2143480b4052-proxy-tls\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278453 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7vc\" (UniqueName: \"kubernetes.io/projected/0ff274f4-3245-4179-b42f-7d113812766b-kube-api-access-hq7vc\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278471 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxz4c\" (UniqueName: \"kubernetes.io/projected/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-kube-api-access-cxz4c\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278524 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b937c9-ef34-4082-9daf-58994e3a7272-serving-cert\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c795069-a3a9-4cdd-994d-23b51b69f386-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxlq5\" (UID: \"3c795069-a3a9-4cdd-994d-23b51b69f386\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278561 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmz9\" (UniqueName: 
\"kubernetes.io/projected/3c795069-a3a9-4cdd-994d-23b51b69f386-kube-api-access-swmz9\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxlq5\" (UID: \"3c795069-a3a9-4cdd-994d-23b51b69f386\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278581 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330066b4-36c9-4414-aa1d-788a59b3ce13-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278611 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278657 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c335827-0757-4eb8-83ec-13fdfd9ee948-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278681 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fceb997-3744-48c3-94fd-4960c139b01d-metrics-tls\") pod \"dns-operator-744455d44c-fxnfx\" (UID: \"5fceb997-3744-48c3-94fd-4960c139b01d\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278699 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff46e39-23b9-4930-80d0-89d5d91ed2a4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278737 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg95v\" (UniqueName: \"kubernetes.io/projected/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-kube-api-access-cg95v\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278754 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-metrics-certs\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278773 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-service-ca\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278800 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b02d37d7-8347-4725-8964-2adad8d0e673-auth-proxy-config\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278815 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4da3cc8b-1975-4899-a45a-465cd4c5170d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-bound-sa-token\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278845 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98a0a088-a592-47fd-b92a-b73db5432a7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278860 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jtk\" (UniqueName: \"kubernetes.io/projected/98a0a088-a592-47fd-b92a-b73db5432a7b-kube-api-access-z7jtk\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278877 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5m2b\" (UniqueName: \"kubernetes.io/projected/1656bc75-cc92-468d-b969-0fe7cb29c705-kube-api-access-j5m2b\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278894 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-registration-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278909 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-csi-data-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278928 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-certificates\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278943 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278958 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfxxf\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-kube-api-access-nfxxf\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278972 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6eb876e4-bbb0-4359-9fc7-eea081c878e1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p9hfb\" (UID: \"6eb876e4-bbb0-4359-9fc7-eea081c878e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.278990 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a6b937c9-ef34-4082-9daf-58994e3a7272-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279006 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b02d37d7-8347-4725-8964-2adad8d0e673-proxy-tls\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279024 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-trusted-ca-bundle\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279040 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279055 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xv2v\" (UniqueName: \"kubernetes.io/projected/330066b4-36c9-4414-aa1d-788a59b3ce13-kube-api-access-9xv2v\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279070 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279088 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-trusted-ca\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279102 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-encryption-config\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279116 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-config-volume\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279134 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279150 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-console-config\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6edfa360-6c55-4e5a-a26c-a78eacd4a143-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4wmcx\" (UID: \"6edfa360-6c55-4e5a-a26c-a78eacd4a143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279182 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-serving-cert\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f5af618f-1da2-4ef1-897c-6269de1ff17a-node-bootstrap-token\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279211 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4da3cc8b-1975-4899-a45a-465cd4c5170d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279242 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/330066b4-36c9-4414-aa1d-788a59b3ce13-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279387 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2635e37-95ea-4edd-a446-2143480b4052-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-tls\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279453 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7lb\" (UniqueName: \"kubernetes.io/projected/a4b1008c-a714-49a2-8c17-e971afc302af-kube-api-access-sm7lb\") pod \"auto-csr-approver-29566090-fczg6\" (UID: 
\"a4b1008c-a714-49a2-8c17-e971afc302af\") " pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vn8\" (UniqueName: \"kubernetes.io/projected/b02d37d7-8347-4725-8964-2adad8d0e673-kube-api-access-b2vn8\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-client-ca\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.279539 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqk6\" (UniqueName: \"kubernetes.io/projected/2b014b7a-971f-468b-ab6d-ff31adaae80c-kube-api-access-4vqk6\") pod \"package-server-manager-789f6589d5-xd2f2\" (UID: \"2b014b7a-971f-468b-ab6d-ff31adaae80c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.288240 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-trusted-ca-bundle\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.288743 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-console-config\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.289082 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b02d37d7-8347-4725-8964-2adad8d0e673-auth-proxy-config\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.290918 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c795069-a3a9-4cdd-994d-23b51b69f386-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxlq5\" (UID: \"3c795069-a3a9-4cdd-994d-23b51b69f386\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.291722 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-serving-cert\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.291791 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972ms\" (UniqueName: \"kubernetes.io/projected/a2635e37-95ea-4edd-a446-2143480b4052-kube-api-access-972ms\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.293516 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.294480 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-certificates\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.297297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299046 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zq2wf\" (UniqueName: \"kubernetes.io/projected/eff46e39-23b9-4930-80d0-89d5d91ed2a4-kube-api-access-zq2wf\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299098 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c335827-0757-4eb8-83ec-13fdfd9ee948-config\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299125 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-serviceca\") pod \"image-pruner-29566080-vq54z\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299145 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-socket-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299162 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zc95\" (UniqueName: \"kubernetes.io/projected/a6b937c9-ef34-4082-9daf-58994e3a7272-kube-api-access-2zc95\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 
20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299180 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-etcd-client\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299200 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bms72\" (UniqueName: \"kubernetes.io/projected/e91db560-3692-4697-bbf0-3c5a8438d5e5-kube-api-access-bms72\") pod \"auto-csr-approver-29566088-pbdgz\" (UID: \"e91db560-3692-4697-bbf0-3c5a8438d5e5\") " pod="openshift-infra/auto-csr-approver-29566088-pbdgz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299302 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hszj\" (UniqueName: \"kubernetes.io/projected/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-kube-api-access-6hszj\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299332 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299352 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f35b5e9-505a-4269-a74d-7a4523284289-trusted-ca\") pod \"console-operator-58897d9998-r2ssr\" (UID: 
\"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299371 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlcr\" (UniqueName: \"kubernetes.io/projected/6edfa360-6c55-4e5a-a26c-a78eacd4a143-kube-api-access-8zlcr\") pod \"cluster-samples-operator-665b6dd947-4wmcx\" (UID: \"6edfa360-6c55-4e5a-a26c-a78eacd4a143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299392 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299412 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299428 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-ca\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299451 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-stats-auth\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.299508 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dcsh\" (UniqueName: \"kubernetes.io/projected/fb1e4d21-3692-4e9d-914c-12cd30f641fa-kube-api-access-2dcsh\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.300806 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-trusted-ca\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.303720 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6edfa360-6c55-4e5a-a26c-a78eacd4a143-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4wmcx\" (UID: \"6edfa360-6c55-4e5a-a26c-a78eacd4a143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 
00:10:02.306203 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b937c9-ef34-4082-9daf-58994e3a7272-serving-cert\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.306444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a6b937c9-ef34-4082-9daf-58994e3a7272-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.306884 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:02.806867187 +0000 UTC m=+217.033404704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.306902 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-encryption-config\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.307401 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308381 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ltv5\" (UniqueName: \"kubernetes.io/projected/5fceb997-3744-48c3-94fd-4960c139b01d-kube-api-access-2ltv5\") pod \"dns-operator-744455d44c-fxnfx\" (UID: \"5fceb997-3744-48c3-94fd-4960c139b01d\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308657 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a0a088-a592-47fd-b92a-b73db5432a7b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308758 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6817894e-4517-4622-9be5-3b6b329b4d31-config\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r28kj\" (UniqueName: \"kubernetes.io/projected/cd4e6e62-b584-4d8b-b697-800317b3a3a1-kube-api-access-r28kj\") pod \"ingress-canary-qnw4x\" (UID: \"cd4e6e62-b584-4d8b-b697-800317b3a3a1\") " pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308842 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-metrics-tls\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308713 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-serviceca\") pod \"image-pruner-29566080-vq54z\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-config\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b014b7a-971f-468b-ab6d-ff31adaae80c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xd2f2\" (UID: \"2b014b7a-971f-468b-ab6d-ff31adaae80c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.308990 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f5af618f-1da2-4ef1-897c-6269de1ff17a-certs\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309019 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2w8w\" (UniqueName: \"kubernetes.io/projected/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-kube-api-access-k2w8w\") pod \"collect-profiles-29566080-jpqpz\" 
(UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309054 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7912604b-d8ba-4510-9694-c6778f3d1c8e-service-ca-bundle\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309155 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-policies\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309185 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw62n\" (UniqueName: \"kubernetes.io/projected/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-kube-api-access-zw62n\") pod \"image-pruner-29566080-vq54z\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309236 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1656bc75-cc92-468d-b969-0fe7cb29c705-console-oauth-config\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309267 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309293 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-service-ca\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309322 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfdq\" (UniqueName: \"kubernetes.io/projected/42544638-2b4a-4acd-bd73-e1fc16a3c921-kube-api-access-xpfdq\") pod \"migrator-59844c95c7-ghlsv\" (UID: \"42544638-2b4a-4acd-bd73-e1fc16a3c921\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309350 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-679w4\" (UniqueName: \"kubernetes.io/projected/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-kube-api-access-679w4\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.309379 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-signing-key\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.311167 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-config-volume\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.312211 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-etcd-client\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.312630 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b02d37d7-8347-4725-8964-2adad8d0e673-proxy-tls\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.312787 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a0a088-a592-47fd-b92a-b73db5432a7b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.313139 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 
00:10:02.314241 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-tls\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.314374 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f35b5e9-505a-4269-a74d-7a4523284289-trusted-ca\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.314861 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-policies\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.315117 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.315320 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-config\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: 
I0320 00:10:02.315616 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.315670 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.315689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1656bc75-cc92-468d-b969-0fe7cb29c705-console-oauth-config\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.315944 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.315950 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-config\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: 
\"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316154 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-audit-dir\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316263 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6817894e-4517-4622-9be5-3b6b329b4d31-serving-cert\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316310 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-audit-dir\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f35b5e9-505a-4269-a74d-7a4523284289-config\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316395 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-trusted-ca\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316398 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316443 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdk9\" (UniqueName: \"kubernetes.io/projected/056f78fa-8b9c-4997-9d3a-00103d32eb09-kube-api-access-hzdk9\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316514 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/585bf14f-1828-4820-b5b2-2aa8992b4499-tmpfs\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316583 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-oauth-serving-cert\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.316656 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-mountpoint-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317366 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f35b5e9-505a-4269-a74d-7a4523284289-config\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317381 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-oauth-serving-cert\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317478 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7d2h\" (UniqueName: \"kubernetes.io/projected/3283192e-4886-4409-8a9e-dda503c25e85-kube-api-access-c7d2h\") pod \"downloads-7954f5f757-tp4c2\" (UID: \"3283192e-4886-4409-8a9e-dda503c25e85\") " pod="openshift-console/downloads-7954f5f757-tp4c2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317563 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmth\" (UniqueName: \"kubernetes.io/projected/4cb00f5a-6c08-44bf-9f41-fd4160b07716-kube-api-access-kzmth\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 
00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317696 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff274f4-3245-4179-b42f-7d113812766b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317738 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1656bc75-cc92-468d-b969-0fe7cb29c705-console-serving-cert\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317792 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317826 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c335827-0757-4eb8-83ec-13fdfd9ee948-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317914 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpkfj\" (UniqueName: \"kubernetes.io/projected/2f35b5e9-505a-4269-a74d-7a4523284289-kube-api-access-xpkfj\") pod 
\"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.317950 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdwd\" (UniqueName: \"kubernetes.io/projected/7921e429-29e3-4d4e-bddd-8b99537ca63f-kube-api-access-szdwd\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318087 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-audit-policies\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318118 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a0a088-a592-47fd-b92a-b73db5432a7b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318228 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff46e39-23b9-4930-80d0-89d5d91ed2a4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318271 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9dbm\" (UniqueName: \"kubernetes.io/projected/585bf14f-1828-4820-b5b2-2aa8992b4499-kube-api-access-j9dbm\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318288 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff274f4-3245-4179-b42f-7d113812766b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318302 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df8ee0a-f645-4f19-b75c-9cd29e21be30-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318334 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q8j\" (UniqueName: \"kubernetes.io/projected/7912604b-d8ba-4510-9694-c6778f3d1c8e-kube-api-access-c5q8j\") pod 
\"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.318967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.319549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.319643 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff46e39-23b9-4930-80d0-89d5d91ed2a4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.320121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df8ee0a-f645-4f19-b75c-9cd29e21be30-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.320222 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-audit-policies\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.320450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f35b5e9-505a-4269-a74d-7a4523284289-serving-cert\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.320590 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cb00f5a-6c08-44bf-9f41-fd4160b07716-profile-collector-cert\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.320920 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8171ee3e-3418-4537-b2a5-52941744cba6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321029 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzph2\" (UniqueName: \"kubernetes.io/projected/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-kube-api-access-hzph2\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc 
kubenswrapper[4867]: I0320 00:10:02.321258 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2g8b\" (UniqueName: \"kubernetes.io/projected/6817894e-4517-4622-9be5-3b6b329b4d31-kube-api-access-l2g8b\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321338 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-dir\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321385 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/056f78fa-8b9c-4997-9d3a-00103d32eb09-serving-cert\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321444 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-secret-volume\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321478 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cb00f5a-6c08-44bf-9f41-fd4160b07716-srv-cert\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: 
\"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321538 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-config\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7921e429-29e3-4d4e-bddd-8b99537ca63f-serving-cert\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-default-certificate\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321625 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/585bf14f-1828-4820-b5b2-2aa8992b4499-webhook-cert\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321653 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd4nd\" (UniqueName: \"kubernetes.io/projected/774b2ca8-d862-4e99-ab70-ca68cfc5122c-kube-api-access-bd4nd\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321675 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4da3cc8b-1975-4899-a45a-465cd4c5170d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321708 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gql\" (UniqueName: \"kubernetes.io/projected/8171ee3e-3418-4537-b2a5-52941744cba6-kube-api-access-f4gql\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321741 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-dir\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321940 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-metrics-tls\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.321994 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjzz\" (UniqueName: \"kubernetes.io/projected/f5af618f-1da2-4ef1-897c-6269de1ff17a-kube-api-access-kzjzz\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322015 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-signing-cabundle\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322042 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-client\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322076 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df8ee0a-f645-4f19-b75c-9cd29e21be30-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322128 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322146 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1656bc75-cc92-468d-b969-0fe7cb29c705-service-ca\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322153 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff274f4-3245-4179-b42f-7d113812766b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322222 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b02d37d7-8347-4725-8964-2adad8d0e673-images\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8171ee3e-3418-4537-b2a5-52941744cba6-srv-cert\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 
00:10:02.322353 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-plugins-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322380 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322409 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlcpr\" (UniqueName: \"kubernetes.io/projected/6eb876e4-bbb0-4359-9fc7-eea081c878e1-kube-api-access-jlcpr\") pod \"multus-admission-controller-857f4d67dd-p9hfb\" (UID: \"6eb876e4-bbb0-4359-9fc7-eea081c878e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.322432 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd4e6e62-b584-4d8b-b697-800317b3a3a1-cert\") pod \"ingress-canary-qnw4x\" (UID: \"cd4e6e62-b584-4d8b-b697-800317b3a3a1\") " pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.323024 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b02d37d7-8347-4725-8964-2adad8d0e673-images\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.323885 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f35b5e9-505a-4269-a74d-7a4523284289-serving-cert\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.324035 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.324512 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff274f4-3245-4179-b42f-7d113812766b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.326020 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.326862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/8171ee3e-3418-4537-b2a5-52941744cba6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.330410 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1656bc75-cc92-468d-b969-0fe7cb29c705-console-serving-cert\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.331484 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98a0a088-a592-47fd-b92a-b73db5432a7b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.331862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fceb997-3744-48c3-94fd-4960c139b01d-metrics-tls\") pod \"dns-operator-744455d44c-fxnfx\" (UID: \"5fceb997-3744-48c3-94fd-4960c139b01d\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.332333 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.335935 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.336863 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df8ee0a-f645-4f19-b75c-9cd29e21be30-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.337281 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8171ee3e-3418-4537-b2a5-52941744cba6-srv-cert\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.337889 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.340261 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff46e39-23b9-4930-80d0-89d5d91ed2a4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.342106 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-bound-sa-token\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.343974 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7vc\" (UniqueName: \"kubernetes.io/projected/0ff274f4-3245-4179-b42f-7d113812766b-kube-api-access-hq7vc\") pod \"openshift-controller-manager-operator-756b6f6bc6-nwwgf\" (UID: \"0ff274f4-3245-4179-b42f-7d113812766b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.350706 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jtk\" (UniqueName: \"kubernetes.io/projected/98a0a088-a592-47fd-b92a-b73db5432a7b-kube-api-access-z7jtk\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.351029 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38520: no serving certificate available for the kubelet" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.370824 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfxxf\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-kube-api-access-nfxxf\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.412589 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmz9\" (UniqueName: \"kubernetes.io/projected/3c795069-a3a9-4cdd-994d-23b51b69f386-kube-api-access-swmz9\") pod \"control-plane-machine-set-operator-78cbb6b69f-qxlq5\" (UID: \"3c795069-a3a9-4cdd-994d-23b51b69f386\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423100 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423234 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xv2v\" (UniqueName: \"kubernetes.io/projected/330066b4-36c9-4414-aa1d-788a59b3ce13-kube-api-access-9xv2v\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423254 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423271 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-config-volume\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4da3cc8b-1975-4899-a45a-465cd4c5170d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423312 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f5af618f-1da2-4ef1-897c-6269de1ff17a-node-bootstrap-token\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423326 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/330066b4-36c9-4414-aa1d-788a59b3ce13-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423343 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2635e37-95ea-4edd-a446-2143480b4052-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm7lb\" (UniqueName: \"kubernetes.io/projected/a4b1008c-a714-49a2-8c17-e971afc302af-kube-api-access-sm7lb\") pod \"auto-csr-approver-29566090-fczg6\" (UID: \"a4b1008c-a714-49a2-8c17-e971afc302af\") " pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423375 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972ms\" (UniqueName: \"kubernetes.io/projected/a2635e37-95ea-4edd-a446-2143480b4052-kube-api-access-972ms\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423402 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-client-ca\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423419 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqk6\" (UniqueName: \"kubernetes.io/projected/2b014b7a-971f-468b-ab6d-ff31adaae80c-kube-api-access-4vqk6\") pod \"package-server-manager-789f6589d5-xd2f2\" (UID: \"2b014b7a-971f-468b-ab6d-ff31adaae80c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423438 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7c335827-0757-4eb8-83ec-13fdfd9ee948-config\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423452 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-socket-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bms72\" (UniqueName: \"kubernetes.io/projected/e91db560-3692-4697-bbf0-3c5a8438d5e5-kube-api-access-bms72\") pod \"auto-csr-approver-29566088-pbdgz\" (UID: \"e91db560-3692-4697-bbf0-3c5a8438d5e5\") " pod="openshift-infra/auto-csr-approver-29566088-pbdgz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423505 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hszj\" (UniqueName: \"kubernetes.io/projected/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-kube-api-access-6hszj\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423531 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-stats-auth\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423551 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-ca\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423576 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6817894e-4517-4622-9be5-3b6b329b4d31-config\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r28kj\" (UniqueName: \"kubernetes.io/projected/cd4e6e62-b584-4d8b-b697-800317b3a3a1-kube-api-access-r28kj\") pod \"ingress-canary-qnw4x\" (UID: \"cd4e6e62-b584-4d8b-b697-800317b3a3a1\") " pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.424444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-config-volume\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.424553 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-client-ca\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.424633 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:02.924606362 +0000 UTC m=+217.151143979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.424891 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-socket-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425167 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c335827-0757-4eb8-83ec-13fdfd9ee948-config\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.423612 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-metrics-tls\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425282 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f5af618f-1da2-4ef1-897c-6269de1ff17a-certs\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425447 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-ca\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425545 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6817894e-4517-4622-9be5-3b6b329b4d31-config\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425632 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b014b7a-971f-468b-ab6d-ff31adaae80c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xd2f2\" (UID: \"2b014b7a-971f-468b-ab6d-ff31adaae80c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2w8w\" (UniqueName: \"kubernetes.io/projected/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-kube-api-access-k2w8w\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 
00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425675 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7912604b-d8ba-4510-9694-c6778f3d1c8e-service-ca-bundle\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425699 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-config-volume\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425746 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-signing-key\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425770 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-config\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425797 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6817894e-4517-4622-9be5-3b6b329b4d31-serving-cert\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc 
kubenswrapper[4867]: I0320 00:10:02.425814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-trusted-ca\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdk9\" (UniqueName: \"kubernetes.io/projected/056f78fa-8b9c-4997-9d3a-00103d32eb09-kube-api-access-hzdk9\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425847 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/585bf14f-1828-4820-b5b2-2aa8992b4499-tmpfs\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425862 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-mountpoint-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425883 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmth\" (UniqueName: \"kubernetes.io/projected/4cb00f5a-6c08-44bf-9f41-fd4160b07716-kube-api-access-kzmth\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425897 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c335827-0757-4eb8-83ec-13fdfd9ee948-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425914 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdwd\" (UniqueName: \"kubernetes.io/projected/7921e429-29e3-4d4e-bddd-8b99537ca63f-kube-api-access-szdwd\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9dbm\" (UniqueName: \"kubernetes.io/projected/585bf14f-1828-4820-b5b2-2aa8992b4499-kube-api-access-j9dbm\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425959 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q8j\" (UniqueName: \"kubernetes.io/projected/7912604b-d8ba-4510-9694-c6778f3d1c8e-kube-api-access-c5q8j\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425977 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/4cb00f5a-6c08-44bf-9f41-fd4160b07716-profile-collector-cert\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425994 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzph2\" (UniqueName: \"kubernetes.io/projected/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-kube-api-access-hzph2\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426013 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2g8b\" (UniqueName: \"kubernetes.io/projected/6817894e-4517-4622-9be5-3b6b329b4d31-kube-api-access-l2g8b\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426029 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/056f78fa-8b9c-4997-9d3a-00103d32eb09-serving-cert\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426045 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-secret-volume\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 
00:10:02.426060 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cb00f5a-6c08-44bf-9f41-fd4160b07716-srv-cert\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426075 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-default-certificate\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426090 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/585bf14f-1828-4820-b5b2-2aa8992b4499-webhook-cert\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426105 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-config\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426119 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7921e429-29e3-4d4e-bddd-8b99537ca63f-serving-cert\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426134 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4da3cc8b-1975-4899-a45a-465cd4c5170d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426151 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd4nd\" (UniqueName: \"kubernetes.io/projected/774b2ca8-d862-4e99-ab70-ca68cfc5122c-kube-api-access-bd4nd\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426169 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-client\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-metrics-tls\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426202 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjzz\" (UniqueName: 
\"kubernetes.io/projected/f5af618f-1da2-4ef1-897c-6269de1ff17a-kube-api-access-kzjzz\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426217 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-signing-cabundle\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426233 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-plugins-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426248 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426265 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlcpr\" (UniqueName: \"kubernetes.io/projected/6eb876e4-bbb0-4359-9fc7-eea081c878e1-kube-api-access-jlcpr\") pod \"multus-admission-controller-857f4d67dd-p9hfb\" (UID: \"6eb876e4-bbb0-4359-9fc7-eea081c878e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426281 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd4e6e62-b584-4d8b-b697-800317b3a3a1-cert\") pod \"ingress-canary-qnw4x\" (UID: \"cd4e6e62-b584-4d8b-b697-800317b3a3a1\") " pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426296 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/585bf14f-1828-4820-b5b2-2aa8992b4499-apiservice-cert\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426312 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2635e37-95ea-4edd-a446-2143480b4052-proxy-tls\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426329 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxz4c\" (UniqueName: \"kubernetes.io/projected/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-kube-api-access-cxz4c\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426347 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330066b4-36c9-4414-aa1d-788a59b3ce13-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426364 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c335827-0757-4eb8-83ec-13fdfd9ee948-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426381 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg95v\" (UniqueName: \"kubernetes.io/projected/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-kube-api-access-cg95v\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426395 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-metrics-certs\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426414 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426431 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-service-ca\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426445 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4da3cc8b-1975-4899-a45a-465cd4c5170d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426465 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-registration-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426480 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-csi-data-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6eb876e4-bbb0-4359-9fc7-eea081c878e1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p9hfb\" (UID: \"6eb876e4-bbb0-4359-9fc7-eea081c878e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426587 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7912604b-d8ba-4510-9694-c6778f3d1c8e-service-ca-bundle\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.426837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.427126 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-config-volume\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.428117 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-stats-auth\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.428921 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/585bf14f-1828-4820-b5b2-2aa8992b4499-tmpfs\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.429406 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/330066b4-36c9-4414-aa1d-788a59b3ce13-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.430011 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-mountpoint-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.430139 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-config\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.432026 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-service-ca\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.432940 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f5af618f-1da2-4ef1-897c-6269de1ff17a-node-bootstrap-token\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.432942 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c335827-0757-4eb8-83ec-13fdfd9ee948-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.425651 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2635e37-95ea-4edd-a446-2143480b4052-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433050 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4da3cc8b-1975-4899-a45a-465cd4c5170d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433113 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-registration-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433179 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-csi-data-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc kubenswrapper[4867]: 
I0320 00:10:02.433461 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cb00f5a-6c08-44bf-9f41-fd4160b07716-srv-cert\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433471 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/330066b4-36c9-4414-aa1d-788a59b3ce13-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6eb876e4-bbb0-4359-9fc7-eea081c878e1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p9hfb\" (UID: \"6eb876e4-bbb0-4359-9fc7-eea081c878e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433601 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-metrics-tls\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.433676 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/774b2ca8-d862-4e99-ab70-ca68cfc5122c-plugins-dir\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:02 crc 
kubenswrapper[4867]: I0320 00:10:02.434702 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-config\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.435278 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-signing-key\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.435359 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f5af618f-1da2-4ef1-897c-6269de1ff17a-certs\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.436009 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-trusted-ca\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.436115 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-secret-volume\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc 
kubenswrapper[4867]: I0320 00:10:02.436675 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7921e429-29e3-4d4e-bddd-8b99537ca63f-serving-cert\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.437248 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-metrics-certs\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.437663 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-signing-cabundle\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.437807 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6817894e-4517-4622-9be5-3b6b329b4d31-serving-cert\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.437815 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/056f78fa-8b9c-4997-9d3a-00103d32eb09-etcd-client\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 
00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.438400 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/585bf14f-1828-4820-b5b2-2aa8992b4499-webhook-cert\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.438595 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7912604b-d8ba-4510-9694-c6778f3d1c8e-default-certificate\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.438877 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/056f78fa-8b9c-4997-9d3a-00103d32eb09-serving-cert\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.439368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd4e6e62-b584-4d8b-b697-800317b3a3a1-cert\") pod \"ingress-canary-qnw4x\" (UID: \"cd4e6e62-b584-4d8b-b697-800317b3a3a1\") " pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.439669 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:02 crc 
kubenswrapper[4867]: I0320 00:10:02.439831 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b014b7a-971f-468b-ab6d-ff31adaae80c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xd2f2\" (UID: \"2b014b7a-971f-468b-ab6d-ff31adaae80c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.440572 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2635e37-95ea-4edd-a446-2143480b4052-proxy-tls\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.441059 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/585bf14f-1828-4820-b5b2-2aa8992b4499-apiservice-cert\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.441532 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-metrics-tls\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.441720 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cb00f5a-6c08-44bf-9f41-fd4160b07716-profile-collector-cert\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: 
\"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.446826 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4da3cc8b-1975-4899-a45a-465cd4c5170d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.450953 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2wf\" (UniqueName: \"kubernetes.io/projected/eff46e39-23b9-4930-80d0-89d5d91ed2a4-kube-api-access-zq2wf\") pod \"openshift-apiserver-operator-796bbdcf4f-s97mj\" (UID: \"eff46e39-23b9-4930-80d0-89d5d91ed2a4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.456288 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vn8\" (UniqueName: \"kubernetes.io/projected/b02d37d7-8347-4725-8964-2adad8d0e673-kube-api-access-b2vn8\") pod \"machine-config-operator-74547568cd-spjhj\" (UID: \"b02d37d7-8347-4725-8964-2adad8d0e673\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.475191 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zc95\" (UniqueName: \"kubernetes.io/projected/a6b937c9-ef34-4082-9daf-58994e3a7272-kube-api-access-2zc95\") pod \"openshift-config-operator-7777fb866f-zvttc\" (UID: \"a6b937c9-ef34-4082-9daf-58994e3a7272\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.493182 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j5m2b\" (UniqueName: \"kubernetes.io/projected/1656bc75-cc92-468d-b969-0fe7cb29c705-kube-api-access-j5m2b\") pod \"console-f9d7485db-mzkng\" (UID: \"1656bc75-cc92-468d-b969-0fe7cb29c705\") " pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.502695 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.512916 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlcr\" (UniqueName: \"kubernetes.io/projected/6edfa360-6c55-4e5a-a26c-a78eacd4a143-kube-api-access-8zlcr\") pod \"cluster-samples-operator-665b6dd947-4wmcx\" (UID: \"6edfa360-6c55-4e5a-a26c-a78eacd4a143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.513368 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.520463 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.526192 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.527541 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.528117 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.028102547 +0000 UTC m=+217.254640064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.534882 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dcsh\" (UniqueName: \"kubernetes.io/projected/fb1e4d21-3692-4e9d-914c-12cd30f641fa-kube-api-access-2dcsh\") pod \"oauth-openshift-558db77b4-bwpfb\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") " pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.543728 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.551882 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38522: no serving certificate available for the kubelet" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.584040 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ltv5\" (UniqueName: \"kubernetes.io/projected/5fceb997-3744-48c3-94fd-4960c139b01d-kube-api-access-2ltv5\") pod \"dns-operator-744455d44c-fxnfx\" (UID: \"5fceb997-3744-48c3-94fd-4960c139b01d\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.593146 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-679w4\" (UniqueName: \"kubernetes.io/projected/7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7-kube-api-access-679w4\") pod \"apiserver-7bbb656c7d-f98fd\" (UID: \"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.611730 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.612426 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw62n\" (UniqueName: \"kubernetes.io/projected/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-kube-api-access-zw62n\") pod \"image-pruner-29566080-vq54z\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.624859 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.628402 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.628813 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.128797431 +0000 UTC m=+217.355334948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.637674 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfdq\" (UniqueName: \"kubernetes.io/projected/42544638-2b4a-4acd-bd73-e1fc16a3c921-kube-api-access-xpfdq\") pod \"migrator-59844c95c7-ghlsv\" (UID: \"42544638-2b4a-4acd-bd73-e1fc16a3c921\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.647329 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7d4db17f-0c6c-45c8-bfea-5251e2f6690d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qspsc\" (UID: \"7d4db17f-0c6c-45c8-bfea-5251e2f6690d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.666901 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7d2h\" (UniqueName: \"kubernetes.io/projected/3283192e-4886-4409-8a9e-dda503c25e85-kube-api-access-c7d2h\") pod \"downloads-7954f5f757-tp4c2\" (UID: \"3283192e-4886-4409-8a9e-dda503c25e85\") " pod="openshift-console/downloads-7954f5f757-tp4c2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.689107 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a0a088-a592-47fd-b92a-b73db5432a7b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zd5l2\" (UID: \"98a0a088-a592-47fd-b92a-b73db5432a7b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.698775 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpkfj\" (UniqueName: \"kubernetes.io/projected/2f35b5e9-505a-4269-a74d-7a4523284289-kube-api-access-xpkfj\") pod \"console-operator-58897d9998-r2ssr\" (UID: \"2f35b5e9-505a-4269-a74d-7a4523284289\") " pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.721078 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gql\" (UniqueName: \"kubernetes.io/projected/8171ee3e-3418-4537-b2a5-52941744cba6-kube-api-access-f4gql\") pod \"olm-operator-6b444d44fb-c8s9z\" (UID: \"8171ee3e-3418-4537-b2a5-52941744cba6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc 
kubenswrapper[4867]: I0320 00:10:02.730229 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.730610 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.230595063 +0000 UTC m=+217.457132580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.749959 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.768451 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tp4c2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.772070 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7lb\" (UniqueName: \"kubernetes.io/projected/a4b1008c-a714-49a2-8c17-e971afc302af-kube-api-access-sm7lb\") pod \"auto-csr-approver-29566090-fczg6\" (UID: \"a4b1008c-a714-49a2-8c17-e971afc302af\") " pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.777752 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bms72\" (UniqueName: \"kubernetes.io/projected/e91db560-3692-4697-bbf0-3c5a8438d5e5-kube-api-access-bms72\") pod \"auto-csr-approver-29566088-pbdgz\" (UID: \"e91db560-3692-4697-bbf0-3c5a8438d5e5\") " pod="openshift-infra/auto-csr-approver-29566088-pbdgz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.790332 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.796166 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972ms\" (UniqueName: \"kubernetes.io/projected/a2635e37-95ea-4edd-a446-2143480b4052-kube-api-access-972ms\") pod \"machine-config-controller-84d6567774-r84fp\" (UID: \"a2635e37-95ea-4edd-a446-2143480b4052\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.815096 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xv2v\" (UniqueName: \"kubernetes.io/projected/330066b4-36c9-4414-aa1d-788a59b3ce13-kube-api-access-9xv2v\") pod \"kube-storage-version-migrator-operator-b67b599dd-z6fgw\" (UID: \"330066b4-36c9-4414-aa1d-788a59b3ce13\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.831200 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.831376 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.331347518 +0000 UTC m=+217.557885035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.831453 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.831762 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.331744408 +0000 UTC m=+217.558281925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.834023 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.844872 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.847638 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqk6\" (UniqueName: \"kubernetes.io/projected/2b014b7a-971f-468b-ab6d-ff31adaae80c-kube-api-access-4vqk6\") pod \"package-server-manager-789f6589d5-xd2f2\" (UID: \"2b014b7a-971f-468b-ab6d-ff31adaae80c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.847882 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.855357 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.871518 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.871769 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4da3cc8b-1975-4899-a45a-465cd4c5170d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6z9p7\" (UID: \"4da3cc8b-1975-4899-a45a-465cd4c5170d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.876543 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r28kj\" (UniqueName: \"kubernetes.io/projected/cd4e6e62-b584-4d8b-b697-800317b3a3a1-kube-api-access-r28kj\") pod \"ingress-canary-qnw4x\" (UID: \"cd4e6e62-b584-4d8b-b697-800317b3a3a1\") " pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.878766 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.884533 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.907930 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hszj\" (UniqueName: \"kubernetes.io/projected/cfaa1b05-7c04-42f5-a077-6dc6b19084b0-kube-api-access-6hszj\") pod \"dns-default-z88lc\" (UID: \"cfaa1b05-7c04-42f5-a077-6dc6b19084b0\") " pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.908210 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.908867 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38524: no serving certificate available for the kubelet" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.929961 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.930999 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2w8w\" (UniqueName: \"kubernetes.io/projected/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-kube-api-access-k2w8w\") pod \"collect-profiles-29566080-jpqpz\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.933290 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:02 crc kubenswrapper[4867]: E0320 00:10:02.933688 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.433670414 +0000 UTC m=+217.660207931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.935531 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9dbm\" (UniqueName: \"kubernetes.io/projected/585bf14f-1828-4820-b5b2-2aa8992b4499-kube-api-access-j9dbm\") pod \"packageserver-d55dfcdfc-fplqr\" (UID: \"585bf14f-1828-4820-b5b2-2aa8992b4499\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.945145 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.952398 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.959186 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.981692 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5"] Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.988004 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" Mar 20 00:10:02 crc kubenswrapper[4867]: I0320 00:10:02.997769 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c335827-0757-4eb8-83ec-13fdfd9ee948-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b4cbk\" (UID: \"7c335827-0757-4eb8-83ec-13fdfd9ee948\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.006965 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" event={"ID":"c75b5610-6996-4d26-b251-04176399fd0b","Type":"ContainerStarted","Data":"8d8f3d925b7625df443952ffc5aad44e467de5d9920f0e74688dc11cff973e95"} Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.007040 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" 
event={"ID":"c75b5610-6996-4d26-b251-04176399fd0b","Type":"ContainerStarted","Data":"0cf8b8e4ef771e9a9203650ca6a4e6dff1bf9bef6702d27988c8aea3608064a7"} Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.009431 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.011639 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.018174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmth\" (UniqueName: \"kubernetes.io/projected/4cb00f5a-6c08-44bf-9f41-fd4160b07716-kube-api-access-kzmth\") pod \"catalog-operator-68c6474976-2dzk5\" (UID: \"4cb00f5a-6c08-44bf-9f41-fd4160b07716\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.025822 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.032258 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.032337 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg95v\" (UniqueName: \"kubernetes.io/projected/d7f6765c-8450-44ad-895b-0c6b30ed4aa3-kube-api-access-cg95v\") pod \"service-ca-9c57cc56f-fls4v\" (UID: \"d7f6765c-8450-44ad-895b-0c6b30ed4aa3\") " pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.034618 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.034999 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.534985514 +0000 UTC m=+217.761523031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.049395 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.053295 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxz4c\" (UniqueName: \"kubernetes.io/projected/0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a-kube-api-access-cxz4c\") pod \"ingress-operator-5b745b69d9-4v7zf\" (UID: \"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.054749 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.073994 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4nd\" (UniqueName: \"kubernetes.io/projected/774b2ca8-d862-4e99-ab70-ca68cfc5122c-kube-api-access-bd4nd\") pod \"csi-hostpathplugin-7smtj\" (UID: \"774b2ca8-d862-4e99-ab70-ca68cfc5122c\") " pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.081029 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdwd\" (UniqueName: \"kubernetes.io/projected/7921e429-29e3-4d4e-bddd-8b99537ca63f-kube-api-access-szdwd\") pod \"route-controller-manager-6576b87f9c-4s8b4\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.093774 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5q8j\" (UniqueName: \"kubernetes.io/projected/7912604b-d8ba-4510-9694-c6778f3d1c8e-kube-api-access-c5q8j\") pod \"router-default-5444994796-9n97f\" (UID: \"7912604b-d8ba-4510-9694-c6778f3d1c8e\") " pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:03 crc 
kubenswrapper[4867]: I0320 00:10:03.096282 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.103441 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnw4x" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.113763 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7smtj" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.128151 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjzz\" (UniqueName: \"kubernetes.io/projected/f5af618f-1da2-4ef1-897c-6269de1ff17a-kube-api-access-kzjzz\") pod \"machine-config-server-pr89k\" (UID: \"f5af618f-1da2-4ef1-897c-6269de1ff17a\") " pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.150903 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pr89k" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.151071 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.151316 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.651289902 +0000 UTC m=+217.877827419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.153725 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdk9\" (UniqueName: \"kubernetes.io/projected/056f78fa-8b9c-4997-9d3a-00103d32eb09-kube-api-access-hzdk9\") pod \"etcd-operator-b45778765-v4wzv\" (UID: \"056f78fa-8b9c-4997-9d3a-00103d32eb09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.175940 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2g8b\" (UniqueName: \"kubernetes.io/projected/6817894e-4517-4622-9be5-3b6b329b4d31-kube-api-access-l2g8b\") pod \"service-ca-operator-777779d784-pvqbv\" (UID: \"6817894e-4517-4622-9be5-3b6b329b4d31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.180238 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzph2\" (UniqueName: \"kubernetes.io/projected/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-kube-api-access-hzph2\") pod \"marketplace-operator-79b997595-4fhrs\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.200194 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlcpr\" (UniqueName: \"kubernetes.io/projected/6eb876e4-bbb0-4359-9fc7-eea081c878e1-kube-api-access-jlcpr\") pod 
\"multus-admission-controller-857f4d67dd-p9hfb\" (UID: \"6eb876e4-bbb0-4359-9fc7-eea081c878e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.248929 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.254579 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.254946 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.754933731 +0000 UTC m=+217.981471248 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.259662 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.268184 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.279148 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.297349 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.316277 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.344230 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.358336 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.358901 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.858885547 +0000 UTC m=+218.085423064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.401227 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.459794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.460999 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:03.960971616 +0000 UTC m=+218.187509133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.561924 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.562293 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.062275356 +0000 UTC m=+218.288812873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.588832 4867 ???:1] "http: TLS handshake error from 192.168.126.11:38540: no serving certificate available for the kubelet" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.664397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.665031 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.165018182 +0000 UTC m=+218.391555699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.712252 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf"] Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.742868 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kl9jq" podStartSLOduration=167.742848288 podStartE2EDuration="2m47.742848288s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:03.739842142 +0000 UTC m=+217.966379659" watchObservedRunningTime="2026-03-20 00:10:03.742848288 +0000 UTC m=+217.969385805" Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.766950 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.767281 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 00:10:04.267266375 +0000 UTC m=+218.493803882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.804107 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx"] Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.828800 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj"] Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.870152 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.870452 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.370439192 +0000 UTC m=+218.596976709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:03 crc kubenswrapper[4867]: I0320 00:10:03.970672 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:03 crc kubenswrapper[4867]: E0320 00:10:03.970919 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.47090464 +0000 UTC m=+218.697442157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.012286 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-n22zb" podStartSLOduration=167.012269625 podStartE2EDuration="2m47.012269625s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:04.005810232 +0000 UTC m=+218.232347749" watchObservedRunningTime="2026-03-20 00:10:04.012269625 +0000 UTC m=+218.238807142" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.014704 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" event={"ID":"3c795069-a3a9-4cdd-994d-23b51b69f386","Type":"ContainerStarted","Data":"25a81581bde4b08328e856b9e8cad9add1f779bca840e7f4e70bbbb6b50f0171"} Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.014753 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" event={"ID":"3c795069-a3a9-4cdd-994d-23b51b69f386","Type":"ContainerStarted","Data":"345110c5597bc312d5109a52cf54e05f0a0abe79956eebe61fb8beb48c488931"} Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.020243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pr89k" 
event={"ID":"f5af618f-1da2-4ef1-897c-6269de1ff17a","Type":"ContainerStarted","Data":"e12c6d835185f1bbb8fb6d053e74ef7170f37aea75e0a6f1e2895c40e121ad78"} Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.020282 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pr89k" event={"ID":"f5af618f-1da2-4ef1-897c-6269de1ff17a","Type":"ContainerStarted","Data":"ab88267460a847b4dd2b70a51f0ebfe3d8e874fb9696afd47a2319f9b9dd7b0e"} Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.029432 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9n97f" event={"ID":"7912604b-d8ba-4510-9694-c6778f3d1c8e","Type":"ContainerStarted","Data":"0fda5a16c5c5f91b226c8addc9346e89055d19c3cb17171b29a33e6a264c19e2"} Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.029474 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9n97f" event={"ID":"7912604b-d8ba-4510-9694-c6778f3d1c8e","Type":"ContainerStarted","Data":"9994b205fb9afa9cbc5b17de5b1e134f7bf0cdc8c9e210202df9f3492a57e75c"} Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.050672 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xslxh" podStartSLOduration=168.050657215 podStartE2EDuration="2m48.050657215s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:04.049847025 +0000 UTC m=+218.276384552" watchObservedRunningTime="2026-03-20 00:10:04.050657215 +0000 UTC m=+218.277194732" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.072359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.072680 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.572668311 +0000 UTC m=+218.799205818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.173659 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.173860 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.673835907 +0000 UTC m=+218.900373424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.173932 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.174701 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.674685299 +0000 UTC m=+218.901222816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.270800 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.276032 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.276360 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.776332747 +0000 UTC m=+219.002870264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.335577 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" podStartSLOduration=167.33423337 podStartE2EDuration="2m47.33423337s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:04.331672575 +0000 UTC m=+218.558210092" watchObservedRunningTime="2026-03-20 00:10:04.33423337 +0000 UTC m=+218.560770887" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.379194 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.379485 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.879474113 +0000 UTC m=+219.106011630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.480367 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.480592 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.980571607 +0000 UTC m=+219.207109124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.480645 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.480915 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:04.980907696 +0000 UTC m=+219.207445213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.584432 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.584592 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.084570685 +0000 UTC m=+219.311108202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.584702 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.585012 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.084998306 +0000 UTC m=+219.311535823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.624722 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:04 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:04 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:04 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.624964 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.688173 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.688510 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 00:10:05.1884833 +0000 UTC m=+219.415020817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.792160 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.792457 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.292446387 +0000 UTC m=+219.518983904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.823140 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" podStartSLOduration=168.823125482 podStartE2EDuration="2m48.823125482s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:04.821123032 +0000 UTC m=+219.047660549" watchObservedRunningTime="2026-03-20 00:10:04.823125482 +0000 UTC m=+219.049662999" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.829987 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zvttc"] Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.893727 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.894443 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 00:10:05.394415693 +0000 UTC m=+219.620953210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.903319 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj"] Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.903612 4867 ???:1] "http: TLS handshake error from 192.168.126.11:60958: no serving certificate available for the kubelet" Mar 20 00:10:04 crc kubenswrapper[4867]: I0320 00:10:04.997708 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:04 crc kubenswrapper[4867]: E0320 00:10:04.997990 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.49797719 +0000 UTC m=+219.724514707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.013143 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pr89k" podStartSLOduration=5.013126943 podStartE2EDuration="5.013126943s" podCreationTimestamp="2026-03-20 00:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:04.975178214 +0000 UTC m=+219.201715731" watchObservedRunningTime="2026-03-20 00:10:05.013126943 +0000 UTC m=+219.239664450" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.016401 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9n97f" podStartSLOduration=168.016391745 podStartE2EDuration="2m48.016391745s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:05.01224091 +0000 UTC m=+219.238778427" watchObservedRunningTime="2026-03-20 00:10:05.016391745 +0000 UTC m=+219.242929262" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.033668 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" event={"ID":"a6b937c9-ef34-4082-9daf-58994e3a7272","Type":"ContainerStarted","Data":"ac737fadff4b7b9f3cd5b2c2534e5400591c67727b141c3e1ddf6ec329455713"} Mar 20 00:10:05 crc 
kubenswrapper[4867]: I0320 00:10:05.033738 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" event={"ID":"a6b937c9-ef34-4082-9daf-58994e3a7272","Type":"ContainerStarted","Data":"1e8fa81e68050b41722ccd053c0b10537308a28ba957beb9c153972e7dc8687a"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.034905 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" event={"ID":"0ff274f4-3245-4179-b42f-7d113812766b","Type":"ContainerStarted","Data":"0e997bbf108e39f679cd02b53192faada1c8e5d7aa80a996106a91312d584b5a"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.034947 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" event={"ID":"0ff274f4-3245-4179-b42f-7d113812766b","Type":"ContainerStarted","Data":"6f84d36642fc06116c56be827e25852430ccb18b679c8eda0777f63c333740ed"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.035971 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" event={"ID":"b02d37d7-8347-4725-8964-2adad8d0e673","Type":"ContainerStarted","Data":"270163761ce0765f4e3b587f93ba8190da509ed981f7df69dac706d79ae26168"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.042540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" event={"ID":"eff46e39-23b9-4930-80d0-89d5d91ed2a4","Type":"ContainerStarted","Data":"f7025a765ea14ecd9ce264ddb60b620faaa9d0f5ce0f6ccd006a5bc152bb9480"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.042913 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" 
event={"ID":"eff46e39-23b9-4930-80d0-89d5d91ed2a4","Type":"ContainerStarted","Data":"eaea0a875e101dd86339182283a461d6a32e961d722330a74f8155691d0d220d"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.045111 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" event={"ID":"6edfa360-6c55-4e5a-a26c-a78eacd4a143","Type":"ContainerStarted","Data":"79ced92e0956081f41a74ed6c0a68aaa2c2e8f49a6c3db6d533b589198a0f026"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.045140 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" event={"ID":"6edfa360-6c55-4e5a-a26c-a78eacd4a143","Type":"ContainerStarted","Data":"72a13f75acbd9a42cc7e4c7857bd7fd0aeb59bc825c32d41693ae2ea021286f3"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.045149 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" event={"ID":"6edfa360-6c55-4e5a-a26c-a78eacd4a143","Type":"ContainerStarted","Data":"3c120c5a423f9f44c7a6a560bbea85d06f26d60277b0310d5269e10163d1c64c"} Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.048026 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qxlq5" podStartSLOduration=168.048014454 podStartE2EDuration="2m48.048014454s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:05.046863355 +0000 UTC m=+219.273400872" watchObservedRunningTime="2026-03-20 00:10:05.048014454 +0000 UTC m=+219.274551971" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.074790 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mzkng"] Mar 20 
00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.076934 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566090-fczg6"] Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.095927 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1656bc75_cc92_468d_b969_0fe7cb29c705.slice/crio-e79323a14bebf1809d3b4d6c038f1188ded736463a0dd3349a52863ad36e40a8 WatchSource:0}: Error finding container e79323a14bebf1809d3b4d6c038f1188ded736463a0dd3349a52863ad36e40a8: Status 404 returned error can't find the container with id e79323a14bebf1809d3b4d6c038f1188ded736463a0dd3349a52863ad36e40a8 Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.099259 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nwwgf" podStartSLOduration=168.099247868 podStartE2EDuration="2m48.099247868s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:05.096099728 +0000 UTC m=+219.322637245" watchObservedRunningTime="2026-03-20 00:10:05.099247868 +0000 UTC m=+219.325785385" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.099934 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.100123 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.100444 4867 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fls4v"] Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.100663 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.600650563 +0000 UTC m=+219.827188080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.100700 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.100928 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.60092182 +0000 UTC m=+219.827459337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.108137 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29566080-vq54z"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.115005 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z88lc"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.119851 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.127334 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.127381 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r2ssr"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.133056 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s97mj" podStartSLOduration=169.133043131 podStartE2EDuration="2m49.133043131s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:05.131725388 +0000 UTC m=+219.358262905" watchObservedRunningTime="2026-03-20 00:10:05.133043131 +0000 UTC m=+219.359580648" Mar 20 00:10:05 crc 
kubenswrapper[4867]: W0320 00:10:05.190461 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfaa1b05_7c04_42f5_a077_6dc6b19084b0.slice/crio-2b55255a27698d028748c08701d510e111953d45d6a377a33487e38d6168e846 WatchSource:0}: Error finding container 2b55255a27698d028748c08701d510e111953d45d6a377a33487e38d6168e846: Status 404 returned error can't find the container with id 2b55255a27698d028748c08701d510e111953d45d6a377a33487e38d6168e846 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.191524 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f35b5e9_505a_4269_a74d_7a4523284289.slice/crio-58fe0de5f9410cca895d833acb7739ca84e4b50d9126115b6fb33954fee94a3b WatchSource:0}: Error finding container 58fe0de5f9410cca895d833acb7739ca84e4b50d9126115b6fb33954fee94a3b: Status 404 returned error can't find the container with id 58fe0de5f9410cca895d833acb7739ca84e4b50d9126115b6fb33954fee94a3b Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.201634 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.201765 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.701745827 +0000 UTC m=+219.928283344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.201858 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.205673 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.705664766 +0000 UTC m=+219.932202353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.214468 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4wmcx" podStartSLOduration=169.214450168 podStartE2EDuration="2m49.214450168s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:05.21095452 +0000 UTC m=+219.437492037" watchObservedRunningTime="2026-03-20 00:10:05.214450168 +0000 UTC m=+219.440987685" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.272414 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:05 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:05 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:05 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.272694 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.302978 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.303131 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.803108248 +0000 UTC m=+220.029645765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.303667 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.303960 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.80394982 +0000 UTC m=+220.030487337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.404275 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.404457 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.904433438 +0000 UTC m=+220.130970955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.404672 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.404989 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:05.904981762 +0000 UTC m=+220.131519279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.447659 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.449471 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.462437 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.463338 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.465003 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7smtj"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.472694 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566088-pbdgz"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.474707 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5"] Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.503290 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4db17f_0c6c_45c8_bfea_5251e2f6690d.slice/crio-6c3f9054c3b61bcc358cc8152f18514bc5c8cded4ca75df4d24f75d832affa2e WatchSource:0}: Error finding container 6c3f9054c3b61bcc358cc8152f18514bc5c8cded4ca75df4d24f75d832affa2e: Status 404 returned error can't find the container with id 6c3f9054c3b61bcc358cc8152f18514bc5c8cded4ca75df4d24f75d832affa2e Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.505416 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.505585 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.005560293 +0000 UTC m=+220.232097810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.505692 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.506202 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.006188399 +0000 UTC m=+220.232725916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.527106 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fxnfx"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.533514 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.563315 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.564401 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.566383 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-v4wzv"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.568044 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bwpfb"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.569903 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.572025 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf"] Mar 20 00:10:05 crc 
kubenswrapper[4867]: I0320 00:10:05.573550 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.575065 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.576876 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p9hfb"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.578670 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.580313 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tp4c2"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.582268 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz"] Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.606568 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.606924 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.106908764 +0000 UTC m=+220.333446281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.621433 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fceb997_3744_48c3_94fd_4960c139b01d.slice/crio-977c2da33d19dc1d8b93841ced1edd692136fa418afd2501f139a3c673cbf432 WatchSource:0}: Error finding container 977c2da33d19dc1d8b93841ced1edd692136fa418afd2501f139a3c673cbf432: Status 404 returned error can't find the container with id 977c2da33d19dc1d8b93841ced1edd692136fa418afd2501f139a3c673cbf432 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.623478 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c335827_0757_4eb8_83ec_13fdfd9ee948.slice/crio-d623a55740e8fde3cbae00133782fce2f9dd27193521b19db7ec449a94604480 WatchSource:0}: Error finding container d623a55740e8fde3cbae00133782fce2f9dd27193521b19db7ec449a94604480: Status 404 returned error can't find the container with id d623a55740e8fde3cbae00133782fce2f9dd27193521b19db7ec449a94604480 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.624465 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585bf14f_1828_4820_b5b2_2aa8992b4499.slice/crio-3f01037d43db96f59ec3d9d8af054f2e3ea7b27430bc6fed5f70320eebe223e7 WatchSource:0}: Error finding container 3f01037d43db96f59ec3d9d8af054f2e3ea7b27430bc6fed5f70320eebe223e7: Status 404 returned error can't find the container with 
id 3f01037d43db96f59ec3d9d8af054f2e3ea7b27430bc6fed5f70320eebe223e7 Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.625716 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fhrs"] Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.627561 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod056f78fa_8b9c_4997_9d3a_00103d32eb09.slice/crio-d3b7a85d29c31e819df34c78511d8f6d4aadc5ee3364f0e87d8f9e127dc7546f WatchSource:0}: Error finding container d3b7a85d29c31e819df34c78511d8f6d4aadc5ee3364f0e87d8f9e127dc7546f: Status 404 returned error can't find the container with id d3b7a85d29c31e819df34c78511d8f6d4aadc5ee3364f0e87d8f9e127dc7546f Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.630378 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42544638_2b4a_4acd_bd73_e1fc16a3c921.slice/crio-9baa3e518b735d9936b6df2686df5e6eeadc6038217b693330543005ab8a7a3c WatchSource:0}: Error finding container 9baa3e518b735d9936b6df2686df5e6eeadc6038217b693330543005ab8a7a3c: Status 404 returned error can't find the container with id 9baa3e518b735d9936b6df2686df5e6eeadc6038217b693330543005ab8a7a3c Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.631451 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode98ce093_beb3_4d6c_be7b_8afa1e7b514e.slice/crio-bbf3d30864b1ad0091969fc8ac60d26812438d9f0cf3c78ac6ef1b3ea8b28f71 WatchSource:0}: Error finding container bbf3d30864b1ad0091969fc8ac60d26812438d9f0cf3c78ac6ef1b3ea8b28f71: Status 404 returned error can't find the container with id bbf3d30864b1ad0091969fc8ac60d26812438d9f0cf3c78ac6ef1b3ea8b28f71 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.631993 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb876e4_bbb0_4359_9fc7_eea081c878e1.slice/crio-201cd828c3754a354debb05e1ed8e00e514ff2c2cf5fe2894af3e5e998fd68ac WatchSource:0}: Error finding container 201cd828c3754a354debb05e1ed8e00e514ff2c2cf5fe2894af3e5e998fd68ac: Status 404 returned error can't find the container with id 201cd828c3754a354debb05e1ed8e00e514ff2c2cf5fe2894af3e5e998fd68ac Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.633045 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2635e37_95ea_4edd_a446_2143480b4052.slice/crio-0a7edf42938363a76535eaf81f190ad667a526f145cefc341a4e9e0842b81541 WatchSource:0}: Error finding container 0a7edf42938363a76535eaf81f190ad667a526f145cefc341a4e9e0842b81541: Status 404 returned error can't find the container with id 0a7edf42938363a76535eaf81f190ad667a526f145cefc341a4e9e0842b81541 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.635450 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1e4d21_3692_4e9d_914c_12cd30f641fa.slice/crio-01efea2c9a61968229cf21c8c819f3abb71fe1ea8addd4b2a0ac127e375cabb6 WatchSource:0}: Error finding container 01efea2c9a61968229cf21c8c819f3abb71fe1ea8addd4b2a0ac127e375cabb6: Status 404 returned error can't find the container with id 01efea2c9a61968229cf21c8c819f3abb71fe1ea8addd4b2a0ac127e375cabb6 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.640240 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3283192e_4886_4409_8a9e_dda503c25e85.slice/crio-838be937bea1bda9f9311ac0d40d81c6300b9a59b85ffdef32036390bf613f3d WatchSource:0}: Error finding container 838be937bea1bda9f9311ac0d40d81c6300b9a59b85ffdef32036390bf613f3d: Status 404 returned error can't find the container with id 
838be937bea1bda9f9311ac0d40d81c6300b9a59b85ffdef32036390bf613f3d Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.640456 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e954a18_5e7d_4cbf_afc5_6ef3b1afbd5a.slice/crio-ff3b124ea95a14ea9e08eaa4867cd85669b6b4a963cc039d558ed44a27067320 WatchSource:0}: Error finding container ff3b124ea95a14ea9e08eaa4867cd85669b6b4a963cc039d558ed44a27067320: Status 404 returned error can't find the container with id ff3b124ea95a14ea9e08eaa4867cd85669b6b4a963cc039d558ed44a27067320 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.642201 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8171ee3e_3418_4537_b2a5_52941744cba6.slice/crio-a5ef5d613deb57b87953e659981a9651841bc8a762601d2eee28c7448a2c5455 WatchSource:0}: Error finding container a5ef5d613deb57b87953e659981a9651841bc8a762601d2eee28c7448a2c5455: Status 404 returned error can't find the container with id a5ef5d613deb57b87953e659981a9651841bc8a762601d2eee28c7448a2c5455 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.643353 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7921e429_29e3_4d4e_bddd_8b99537ca63f.slice/crio-19757bf40b086184b3df363e0597453dc678db1c73c1605e6fa0a7a9f4292bd0 WatchSource:0}: Error finding container 19757bf40b086184b3df363e0597453dc678db1c73c1605e6fa0a7a9f4292bd0: Status 404 returned error can't find the container with id 19757bf40b086184b3df363e0597453dc678db1c73c1605e6fa0a7a9f4292bd0 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.649806 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b014b7a_971f_468b_ab6d_ff31adaae80c.slice/crio-0e4992247ada806c60a04d515ca39730ac36769f9a3c1a3a770ab5a258bc38d0 
WatchSource:0}: Error finding container 0e4992247ada806c60a04d515ca39730ac36769f9a3c1a3a770ab5a258bc38d0: Status 404 returned error can't find the container with id 0e4992247ada806c60a04d515ca39730ac36769f9a3c1a3a770ab5a258bc38d0 Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.656175 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5fc6d4e_37ff_406f_9de6_cc2a6cba39ce.slice/crio-c433bfbe70451694c72c12339b685ec3a8df0ed54078619d99111767fe3144fb WatchSource:0}: Error finding container c433bfbe70451694c72c12339b685ec3a8df0ed54078619d99111767fe3144fb: Status 404 returned error can't find the container with id c433bfbe70451694c72c12339b685ec3a8df0ed54078619d99111767fe3144fb Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.670253 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnw4x"] Mar 20 00:10:05 crc kubenswrapper[4867]: W0320 00:10:05.701464 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4e6e62_b584_4d8b_b697_800317b3a3a1.slice/crio-7a25adb04befc2cc57cb6ac5f512f8f80fe5970f784990fe9c2640ea79a5d280 WatchSource:0}: Error finding container 7a25adb04befc2cc57cb6ac5f512f8f80fe5970f784990fe9c2640ea79a5d280: Status 404 returned error can't find the container with id 7a25adb04befc2cc57cb6ac5f512f8f80fe5970f784990fe9c2640ea79a5d280 Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.707627 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.708110 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.208096111 +0000 UTC m=+220.434633628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.817035 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.817216 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.317185207 +0000 UTC m=+220.543722724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.817643 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.817915 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.317903805 +0000 UTC m=+220.544441322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.865457 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.866191 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.900716 4867 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vbfqm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]log ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]etcd ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/max-in-flight-filter ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 00:10:05 crc kubenswrapper[4867]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 20 00:10:05 crc kubenswrapper[4867]: 
[+]poststarthook/project.openshift.io-projectcache ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-startinformers ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 00:10:05 crc kubenswrapper[4867]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 00:10:05 crc kubenswrapper[4867]: livez check failed Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.900780 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm" podUID="c75b5610-6996-4d26-b251-04176399fd0b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:05 crc kubenswrapper[4867]: I0320 00:10:05.921560 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:05 crc kubenswrapper[4867]: E0320 00:10:05.921947 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.421931393 +0000 UTC m=+220.648468910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.023385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.023822 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.523810767 +0000 UTC m=+220.750348284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.068243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" event={"ID":"a2635e37-95ea-4edd-a446-2143480b4052","Type":"ContainerStarted","Data":"0a7edf42938363a76535eaf81f190ad667a526f145cefc341a4e9e0842b81541"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.069349 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" event={"ID":"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce","Type":"ContainerStarted","Data":"c433bfbe70451694c72c12339b685ec3a8df0ed54078619d99111767fe3144fb"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.070247 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566090-fczg6" event={"ID":"a4b1008c-a714-49a2-8c17-e971afc302af","Type":"ContainerStarted","Data":"d9874503b86a06a232a1ce0c7a309a3e4c773b14962b9d3286eb5221782a1e69"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.074942 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" event={"ID":"e91db560-3692-4697-bbf0-3c5a8438d5e5","Type":"ContainerStarted","Data":"ff1968471b32a160d4d09b2d1828d15e7279f81c9a3b439e1dcbd29bdde35523"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.076297 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" 
event={"ID":"7921e429-29e3-4d4e-bddd-8b99537ca63f","Type":"ContainerStarted","Data":"19757bf40b086184b3df363e0597453dc678db1c73c1605e6fa0a7a9f4292bd0"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.077160 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" event={"ID":"8171ee3e-3418-4537-b2a5-52941744cba6","Type":"ContainerStarted","Data":"a5ef5d613deb57b87953e659981a9651841bc8a762601d2eee28c7448a2c5455"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.079195 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" event={"ID":"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a","Type":"ContainerStarted","Data":"ff3b124ea95a14ea9e08eaa4867cd85669b6b4a963cc039d558ed44a27067320"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.081060 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mzkng" event={"ID":"1656bc75-cc92-468d-b969-0fe7cb29c705","Type":"ContainerStarted","Data":"dfb655977b605a61241068f97059ba359b70904c3d3774703af1a3d7578cfcf8"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.081086 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mzkng" event={"ID":"1656bc75-cc92-468d-b969-0fe7cb29c705","Type":"ContainerStarted","Data":"e79323a14bebf1809d3b4d6c038f1188ded736463a0dd3349a52863ad36e40a8"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.083038 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" event={"ID":"42544638-2b4a-4acd-bd73-e1fc16a3c921","Type":"ContainerStarted","Data":"9baa3e518b735d9936b6df2686df5e6eeadc6038217b693330543005ab8a7a3c"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.094747 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" 
event={"ID":"fb1e4d21-3692-4e9d-914c-12cd30f641fa","Type":"ContainerStarted","Data":"01efea2c9a61968229cf21c8c819f3abb71fe1ea8addd4b2a0ac127e375cabb6"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.099520 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" event={"ID":"585bf14f-1828-4820-b5b2-2aa8992b4499","Type":"ContainerStarted","Data":"3f01037d43db96f59ec3d9d8af054f2e3ea7b27430bc6fed5f70320eebe223e7"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.100750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" event={"ID":"7c335827-0757-4eb8-83ec-13fdfd9ee948","Type":"ContainerStarted","Data":"d623a55740e8fde3cbae00133782fce2f9dd27193521b19db7ec449a94604480"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.103865 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" event={"ID":"2f35b5e9-505a-4269-a74d-7a4523284289","Type":"ContainerStarted","Data":"bfea3fca1840d61947ca21dcf721a1d4e9bbee1d7c734212186905d500a89a5b"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.103911 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" event={"ID":"2f35b5e9-505a-4269-a74d-7a4523284289","Type":"ContainerStarted","Data":"58fe0de5f9410cca895d833acb7739ca84e4b50d9126115b6fb33954fee94a3b"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.105072 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.107714 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mzkng" podStartSLOduration=169.107701707 podStartE2EDuration="2m49.107701707s" 
podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.102839084 +0000 UTC m=+220.329376591" watchObservedRunningTime="2026-03-20 00:10:06.107701707 +0000 UTC m=+220.334239224" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.112561 4867 patch_prober.go:28] interesting pod/console-operator-58897d9998-r2ssr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.112618 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" podUID="2f35b5e9-505a-4269-a74d-7a4523284289" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.113373 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" event={"ID":"4cb00f5a-6c08-44bf-9f41-fd4160b07716","Type":"ContainerStarted","Data":"3aa6c145f9e4c0dc7334d0a29dcc4e898a0e4e79e36109f9e9997ffff4f42759"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.120308 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" event={"ID":"e98ce093-beb3-4d6c-be7b-8afa1e7b514e","Type":"ContainerStarted","Data":"bbf3d30864b1ad0091969fc8ac60d26812438d9f0cf3c78ac6ef1b3ea8b28f71"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.126060 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.126381 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.626367569 +0000 UTC m=+220.852905086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.127216 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.127843 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.627835186 +0000 UTC m=+220.854372703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.128363 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" podStartSLOduration=169.128338698 podStartE2EDuration="2m49.128338698s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.12801488 +0000 UTC m=+220.354552397" watchObservedRunningTime="2026-03-20 00:10:06.128338698 +0000 UTC m=+220.354876215" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.142262 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" event={"ID":"a6b937c9-ef34-4082-9daf-58994e3a7272","Type":"ContainerDied","Data":"ac737fadff4b7b9f3cd5b2c2534e5400591c67727b141c3e1ddf6ec329455713"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.142143 4867 generic.go:334] "Generic (PLEG): container finished" podID="a6b937c9-ef34-4082-9daf-58994e3a7272" containerID="ac737fadff4b7b9f3cd5b2c2534e5400591c67727b141c3e1ddf6ec329455713" exitCode=0 Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.152342 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"] Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.164079 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-nmk6s"] Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.164279 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" podUID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" containerName="controller-manager" containerID="cri-o://04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351" gracePeriod=30 Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.190641 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" event={"ID":"056f78fa-8b9c-4997-9d3a-00103d32eb09","Type":"ContainerStarted","Data":"d3b7a85d29c31e819df34c78511d8f6d4aadc5ee3364f0e87d8f9e127dc7546f"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.193427 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7smtj" event={"ID":"774b2ca8-d862-4e99-ab70-ca68cfc5122c","Type":"ContainerStarted","Data":"e93ca1239cc7ce71274d4113200cb134d60c4772e59a4569ebc3066adc633739"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.216802 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" event={"ID":"2b014b7a-971f-468b-ab6d-ff31adaae80c","Type":"ContainerStarted","Data":"0e4992247ada806c60a04d515ca39730ac36769f9a3c1a3a770ab5a258bc38d0"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.218155 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" event={"ID":"7d4db17f-0c6c-45c8-bfea-5251e2f6690d","Type":"ContainerStarted","Data":"6c3f9054c3b61bcc358cc8152f18514bc5c8cded4ca75df4d24f75d832affa2e"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.220079 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" event={"ID":"4da3cc8b-1975-4899-a45a-465cd4c5170d","Type":"ContainerStarted","Data":"be5c868d7201bf6d36b8e12b653d6d3b1fd1be3aad736627ea3583686ec8166d"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.220098 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" event={"ID":"4da3cc8b-1975-4899-a45a-465cd4c5170d","Type":"ContainerStarted","Data":"b2865687f0e7cec1e2d24e99cab32141b94b9ac20e614f5b1ee1411070aa6931"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.222148 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" event={"ID":"d7f6765c-8450-44ad-895b-0c6b30ed4aa3","Type":"ContainerStarted","Data":"d0c0da4a2a61dedd92f9b0aac4d5ca0faadedc6b2fdb7863295afb30ceb95b9a"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.222196 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" event={"ID":"d7f6765c-8450-44ad-895b-0c6b30ed4aa3","Type":"ContainerStarted","Data":"1991230a99d931497a2fb5c64c56e709da0e252f75aa8577a6d9590c6ba5c54f"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.224267 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tp4c2" event={"ID":"3283192e-4886-4409-8a9e-dda503c25e85","Type":"ContainerStarted","Data":"838be937bea1bda9f9311ac0d40d81c6300b9a59b85ffdef32036390bf613f3d"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.228960 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:06 crc 
kubenswrapper[4867]: E0320 00:10:06.230121 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.73010455 +0000 UTC m=+220.956642067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.230772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" event={"ID":"98a0a088-a592-47fd-b92a-b73db5432a7b","Type":"ContainerStarted","Data":"627192c62746995655885d89b115c2070cf5293c437e58370a5569b7d6a56dbf"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.230813 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" event={"ID":"98a0a088-a592-47fd-b92a-b73db5432a7b","Type":"ContainerStarted","Data":"b62f29e42fa87e465eb256b2995a3cd4677be41108a0a1b5498370345a27132c"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.234783 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnw4x" event={"ID":"cd4e6e62-b584-4d8b-b697-800317b3a3a1","Type":"ContainerStarted","Data":"7a25adb04befc2cc57cb6ac5f512f8f80fe5970f784990fe9c2640ea79a5d280"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.239452 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" event={"ID":"5fceb997-3744-48c3-94fd-4960c139b01d","Type":"ContainerStarted","Data":"977c2da33d19dc1d8b93841ced1edd692136fa418afd2501f139a3c673cbf432"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.266885 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-fls4v" podStartSLOduration=169.266868848 podStartE2EDuration="2m49.266868848s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.266383546 +0000 UTC m=+220.492921063" watchObservedRunningTime="2026-03-20 00:10:06.266868848 +0000 UTC m=+220.493406365" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.268350 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6z9p7" podStartSLOduration=169.268342786 podStartE2EDuration="2m49.268342786s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.244821121 +0000 UTC m=+220.471358638" watchObservedRunningTime="2026-03-20 00:10:06.268342786 +0000 UTC m=+220.494880303" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.269772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29566080-vq54z" event={"ID":"c74119c8-4cf1-4c7b-9dba-49421dfe52e6","Type":"ContainerStarted","Data":"c79a3591c07bd56445fbc0490fd74c3358c507e92900bf951e368bc70d944d72"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.269800 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29566080-vq54z" 
event={"ID":"c74119c8-4cf1-4c7b-9dba-49421dfe52e6","Type":"ContainerStarted","Data":"f7363620a6e65acd15e636661b16c6bfd5a0b1da89fd34e0f5091acbc1abe61c"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.274832 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" event={"ID":"6eb876e4-bbb0-4359-9fc7-eea081c878e1","Type":"ContainerStarted","Data":"201cd828c3754a354debb05e1ed8e00e514ff2c2cf5fe2894af3e5e998fd68ac"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.275886 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:06 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:06 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:06 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.275919 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.281561 4867 generic.go:334] "Generic (PLEG): container finished" podID="7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7" containerID="070d9cc2f0fb3ba3ffcd1ed2dd05f2348c3200bb3308a153c29dd020383b7d05" exitCode=0 Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.281617 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" event={"ID":"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7","Type":"ContainerDied","Data":"070d9cc2f0fb3ba3ffcd1ed2dd05f2348c3200bb3308a153c29dd020383b7d05"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.281637 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" event={"ID":"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7","Type":"ContainerStarted","Data":"94862319c561997b6cc777c63d2b232702c86d9ff6bda1f13190954dca9db0a3"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.314987 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" event={"ID":"b02d37d7-8347-4725-8964-2adad8d0e673","Type":"ContainerStarted","Data":"bc199bbe343c07771f2825b318d95fa9c50c8d4fdbd408b4d35baad8762cce31"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.315039 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" event={"ID":"b02d37d7-8347-4725-8964-2adad8d0e673","Type":"ContainerStarted","Data":"f88a8d42edbd5c476eee5c06cb271ca2e47146397f72de83fc05ef57aaa01710"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.331329 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.332301 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.832284331 +0000 UTC m=+221.058821848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.342477 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zd5l2" podStartSLOduration=169.342460978 podStartE2EDuration="2m49.342460978s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.30336378 +0000 UTC m=+220.529901317" watchObservedRunningTime="2026-03-20 00:10:06.342460978 +0000 UTC m=+220.568998495" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.343644 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29566080-vq54z" podStartSLOduration=170.343637618 podStartE2EDuration="2m50.343637618s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.339801071 +0000 UTC m=+220.566338608" watchObservedRunningTime="2026-03-20 00:10:06.343637618 +0000 UTC m=+220.570175135" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.374073 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" 
event={"ID":"330066b4-36c9-4414-aa1d-788a59b3ce13","Type":"ContainerStarted","Data":"2bba7279a64c5aa961a096aaeb4ce08ebfadb45407368beb85949882b6b5cfeb"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.374122 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" event={"ID":"330066b4-36c9-4414-aa1d-788a59b3ce13","Type":"ContainerStarted","Data":"526225e18c64601e5613db1fefc59f75e1cda36964ff6ca99e9e07668d733123"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.382529 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" event={"ID":"6817894e-4517-4622-9be5-3b6b329b4d31","Type":"ContainerStarted","Data":"a1eb62ab52a3af5f2ad15436421a06dbb0e048713d8d3e551af9dbcc0dbadafd"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.382563 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" event={"ID":"6817894e-4517-4622-9be5-3b6b329b4d31","Type":"ContainerStarted","Data":"920b43648a9dd8d1e28cdf6b6995fb98f993e47488723d6bd790984fbff39f8e"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.389530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z88lc" event={"ID":"cfaa1b05-7c04-42f5-a077-6dc6b19084b0","Type":"ContainerStarted","Data":"8b288a551bdf4fe2385d47135f6a9adcbed7d27c4bc49c79b4bc64d5fb06e824"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.411769 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z88lc" event={"ID":"cfaa1b05-7c04-42f5-a077-6dc6b19084b0","Type":"ContainerStarted","Data":"2b55255a27698d028748c08701d510e111953d45d6a377a33487e38d6168e846"} Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.424742 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z6fgw" podStartSLOduration=169.424721827 podStartE2EDuration="2m49.424721827s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.41731585 +0000 UTC m=+220.643853367" watchObservedRunningTime="2026-03-20 00:10:06.424721827 +0000 UTC m=+220.651259364" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.435704 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.436209 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.936191817 +0000 UTC m=+221.162729334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.437626 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.440531 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:06.940511736 +0000 UTC m=+221.167049253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.442615 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spjhj" podStartSLOduration=169.442604238 podStartE2EDuration="2m49.442604238s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.436895804 +0000 UTC m=+220.663433321" watchObservedRunningTime="2026-03-20 00:10:06.442604238 +0000 UTC m=+220.669141755" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.470757 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pvqbv" podStartSLOduration=169.470741519 podStartE2EDuration="2m49.470741519s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:06.469032816 +0000 UTC m=+220.695570333" watchObservedRunningTime="2026-03-20 00:10:06.470741519 +0000 UTC m=+220.697279036" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.538950 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.539054 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.039030815 +0000 UTC m=+221.265568332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.539554 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.541218 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.04120125 +0000 UTC m=+221.267738767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.640131 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.640426 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.140412296 +0000 UTC m=+221.366949803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.741789 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.742428 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.242414143 +0000 UTC m=+221.468951660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.814894 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.842561 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.842919 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.342903772 +0000 UTC m=+221.569441289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.956078 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-proxy-ca-bundles\") pod \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.956125 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-config\") pod \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\" (UID: 
\"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.956159 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-client-ca\") pod \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.956185 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-serving-cert\") pod \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.956339 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7w4q\" (UniqueName: \"kubernetes.io/projected/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-kube-api-access-r7w4q\") pod \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\" (UID: \"3a079af5-b58c-44d4-baa5-fc1bfab08cbb\") " Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.956484 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:06 crc kubenswrapper[4867]: E0320 00:10:06.956753 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.456742769 +0000 UTC m=+221.683280276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.957101 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-client-ca" (OuterVolumeSpecName: "client-ca") pod "3a079af5-b58c-44d4-baa5-fc1bfab08cbb" (UID: "3a079af5-b58c-44d4-baa5-fc1bfab08cbb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.957136 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3a079af5-b58c-44d4-baa5-fc1bfab08cbb" (UID: "3a079af5-b58c-44d4-baa5-fc1bfab08cbb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.958125 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-config" (OuterVolumeSpecName: "config") pod "3a079af5-b58c-44d4-baa5-fc1bfab08cbb" (UID: "3a079af5-b58c-44d4-baa5-fc1bfab08cbb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.971115 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-kube-api-access-r7w4q" (OuterVolumeSpecName: "kube-api-access-r7w4q") pod "3a079af5-b58c-44d4-baa5-fc1bfab08cbb" (UID: "3a079af5-b58c-44d4-baa5-fc1bfab08cbb"). InnerVolumeSpecName "kube-api-access-r7w4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:06 crc kubenswrapper[4867]: I0320 00:10:06.976588 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3a079af5-b58c-44d4-baa5-fc1bfab08cbb" (UID: "3a079af5-b58c-44d4-baa5-fc1bfab08cbb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.056995 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.057357 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7w4q\" (UniqueName: \"kubernetes.io/projected/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-kube-api-access-r7w4q\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.057370 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.057379 4867 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.057387 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.057394 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a079af5-b58c-44d4-baa5-fc1bfab08cbb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.057451 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.557436483 +0000 UTC m=+221.783973990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.158865 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.159374 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.659362878 +0000 UTC m=+221.885900395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.259645 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.260057 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.760042002 +0000 UTC m=+221.986579519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.283601 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:07 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:07 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:07 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.283669 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.363747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.364357 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 00:10:07.864344817 +0000 UTC m=+222.090882334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.408317 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" event={"ID":"fb1e4d21-3692-4e9d-914c-12cd30f641fa","Type":"ContainerStarted","Data":"647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.409526 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.419635 4867 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bwpfb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.28:6443/healthz\": dial tcp 10.217.0.28:6443: connect: connection refused" start-of-body= Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.419676 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" podUID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.28:6443/healthz\": dial tcp 10.217.0.28:6443: connect: connection refused" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.443664 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z88lc" 
event={"ID":"cfaa1b05-7c04-42f5-a077-6dc6b19084b0","Type":"ContainerStarted","Data":"7ed7a238fc32c7c10007ad14e4d559d755ffa15ce50d3081f94ca007c38e7879"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.444299 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z88lc" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.453595 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" podStartSLOduration=171.453578032 podStartE2EDuration="2m51.453578032s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.451477068 +0000 UTC m=+221.678014585" watchObservedRunningTime="2026-03-20 00:10:07.453578032 +0000 UTC m=+221.680115549" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.465406 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.465975 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.965948964 +0000 UTC m=+222.192486551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.466378 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.467787 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:07.96777381 +0000 UTC m=+222.194311317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.469116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" event={"ID":"6eb876e4-bbb0-4359-9fc7-eea081c878e1","Type":"ContainerStarted","Data":"3bb7e38521cbd3d72c35ed1a4a3bdcd3794e569bf377b0ee5f547191dba875b6"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.469162 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" event={"ID":"6eb876e4-bbb0-4359-9fc7-eea081c878e1","Type":"ContainerStarted","Data":"d5c752b7daf965c524301faa4e600cc627f4e808bdc2634d1343116ac1e00a4e"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.506907 4867 ???:1] "http: TLS handshake error from 192.168.126.11:60964: no serving certificate available for the kubelet" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.510734 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-p9hfb" podStartSLOduration=170.510717605 podStartE2EDuration="2m50.510717605s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.509908425 +0000 UTC m=+221.736445932" watchObservedRunningTime="2026-03-20 00:10:07.510717605 +0000 UTC m=+221.737255122" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.511189 4867 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z88lc" podStartSLOduration=8.511182127 podStartE2EDuration="8.511182127s" podCreationTimestamp="2026-03-20 00:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.483568759 +0000 UTC m=+221.710106296" watchObservedRunningTime="2026-03-20 00:10:07.511182127 +0000 UTC m=+221.737719644" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.515002 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" event={"ID":"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a","Type":"ContainerStarted","Data":"3562d4b4a02e55131c2d93d75b5fa864405ebfc2069e1d9274b7cb9bd15bda03"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.515209 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" event={"ID":"0e954a18-5e7d-4cbf-afc5-6ef3b1afbd5a","Type":"ContainerStarted","Data":"b15fea312455759f6aec9bac7cff750e10369592675787542ce4efa958ef34a7"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.531583 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" event={"ID":"a6b937c9-ef34-4082-9daf-58994e3a7272","Type":"ContainerStarted","Data":"afbefb7faf447f433b7d424ebd15b149ebe48b568917a5399697c09c59dde68a"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.532170 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.557373 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" 
event={"ID":"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce","Type":"ContainerStarted","Data":"a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.558440 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.566570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" event={"ID":"7921e429-29e3-4d4e-bddd-8b99537ca63f","Type":"ContainerStarted","Data":"b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.566773 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" podUID="7921e429-29e3-4d4e-bddd-8b99537ca63f" containerName="route-controller-manager" containerID="cri-o://b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d" gracePeriod=30 Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.567088 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.567593 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.568910 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 00:10:08.068893685 +0000 UTC m=+222.295431202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.569587 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4v7zf" podStartSLOduration=170.569568422 podStartE2EDuration="2m50.569568422s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.557639931 +0000 UTC m=+221.784177448" watchObservedRunningTime="2026-03-20 00:10:07.569568422 +0000 UTC m=+221.796105959" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.573376 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fhrs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.573434 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.595715 4867 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" podStartSLOduration=171.595694522 podStartE2EDuration="2m51.595694522s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.591721032 +0000 UTC m=+221.818258549" watchObservedRunningTime="2026-03-20 00:10:07.595694522 +0000 UTC m=+221.822232049" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.601804 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tp4c2" event={"ID":"3283192e-4886-4409-8a9e-dda503c25e85","Type":"ContainerStarted","Data":"362fe120f81fc9f9cab0cdaf4fd7e7f91918ac01f15790da1f164dd04368d9ca"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.602509 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tp4c2" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.646049 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-tp4c2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.646122 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tp4c2" podUID="3283192e-4886-4409-8a9e-dda503c25e85" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.646317 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7smtj" 
event={"ID":"774b2ca8-d862-4e99-ab70-ca68cfc5122c","Type":"ContainerStarted","Data":"71530e26eb584c18802287e65bf19e4809f87a10a1c2b998a387a4b6db085575"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.662548 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" podStartSLOduration=170.662530371 podStartE2EDuration="2m50.662530371s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.630613814 +0000 UTC m=+221.857151331" watchObservedRunningTime="2026-03-20 00:10:07.662530371 +0000 UTC m=+221.889067888" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.670050 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.670532 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.170521673 +0000 UTC m=+222.397059190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.671877 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnw4x" event={"ID":"cd4e6e62-b584-4d8b-b697-800317b3a3a1","Type":"ContainerStarted","Data":"5859413b37d001dd5c481f7e4989a0c9cfd3ed673638c99e2ebc6330dd887914"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.692765 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tp4c2" podStartSLOduration=170.692751404 podStartE2EDuration="2m50.692751404s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.664937352 +0000 UTC m=+221.891474869" watchObservedRunningTime="2026-03-20 00:10:07.692751404 +0000 UTC m=+221.919288921" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.693979 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" podStartSLOduration=170.693972705 podStartE2EDuration="2m50.693972705s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.691685818 +0000 UTC m=+221.918223335" watchObservedRunningTime="2026-03-20 00:10:07.693972705 +0000 UTC m=+221.920510232" Mar 20 00:10:07 crc 
kubenswrapper[4867]: I0320 00:10:07.711572 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" event={"ID":"7c335827-0757-4eb8-83ec-13fdfd9ee948","Type":"ContainerStarted","Data":"31bcc93259d97ed7c66a3896ca8d75b29b9d21666d3d2df2317a34400a0602ed"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.722877 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" event={"ID":"7d4db17f-0c6c-45c8-bfea-5251e2f6690d","Type":"ContainerStarted","Data":"cff7fc5db25a5bd095765ceeebddee2fdc04fa5054ab272bb847692d7abdbdb7"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.734000 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" event={"ID":"42544638-2b4a-4acd-bd73-e1fc16a3c921","Type":"ContainerStarted","Data":"fe7afd8d737f549e76c295dea0d350340579d39cd4d1d3cfaa07b524f14b7585"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.734045 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" event={"ID":"42544638-2b4a-4acd-bd73-e1fc16a3c921","Type":"ContainerStarted","Data":"b0f191a6f156fb65cdb00bfd0e0ccb18fe0f3b0ea476fd51989b69ebc63b8aa2"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.736570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" event={"ID":"5fceb997-3744-48c3-94fd-4960c139b01d","Type":"ContainerStarted","Data":"249958a2083f41578a0aae66834ccf0ffb0cb2c554f2fdf42ad90f88626cd2cd"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.736594 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" 
event={"ID":"5fceb997-3744-48c3-94fd-4960c139b01d","Type":"ContainerStarted","Data":"d5cb00fb456b97cb596fa9301aadf48699cbc3f32c0c94b4640ee3d16168f238"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.757063 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" event={"ID":"2b014b7a-971f-468b-ab6d-ff31adaae80c","Type":"ContainerStarted","Data":"c22f2456ad173a49a187d3c204c51477cd53c8901f936b03bd8fb711fbf2cabe"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.757110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" event={"ID":"2b014b7a-971f-468b-ab6d-ff31adaae80c","Type":"ContainerStarted","Data":"e50aeda849d0da11a9f9b7270fad6c9e7fabd34768ace480c0e14e1de55840cb"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.757725 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.762278 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qnw4x" podStartSLOduration=8.762258291 podStartE2EDuration="8.762258291s" podCreationTimestamp="2026-03-20 00:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.719941611 +0000 UTC m=+221.946479128" watchObservedRunningTime="2026-03-20 00:10:07.762258291 +0000 UTC m=+221.988795808" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.770809 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.772034 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.271988766 +0000 UTC m=+222.498526283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.777666 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" event={"ID":"a2635e37-95ea-4edd-a446-2143480b4052","Type":"ContainerStarted","Data":"ee55e4a733614403bf51687415f3d7f853ff1d37190742d750ecec4546d43e60"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.777794 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" event={"ID":"a2635e37-95ea-4edd-a446-2143480b4052","Type":"ContainerStarted","Data":"b4094502973d04730b1e563ec271315647987cbf77fc46164397568a1b79aaaa"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.815773 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b4cbk" podStartSLOduration=170.815753092 podStartE2EDuration="2m50.815753092s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.765109013 +0000 UTC m=+221.991646530" watchObservedRunningTime="2026-03-20 00:10:07.815753092 +0000 UTC m=+222.042290619" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.821982 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" event={"ID":"4cb00f5a-6c08-44bf-9f41-fd4160b07716","Type":"ContainerStarted","Data":"ccb3fb5700ac65ef1b2218d407d90738094d887bce9cfccf81d9e2ef91388160"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.822854 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.834750 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.835757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" event={"ID":"585bf14f-1828-4820-b5b2-2aa8992b4499","Type":"ContainerStarted","Data":"758cb8a5356309c77c75ade4b2115ab74710144ce0e4ef13ae0befe5b74598c3"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.836078 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.841971 4867 generic.go:334] "Generic (PLEG): container finished" podID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" containerID="04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351" exitCode=0 Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.842053 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" 
event={"ID":"3a079af5-b58c-44d4-baa5-fc1bfab08cbb","Type":"ContainerDied","Data":"04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.842077 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" event={"ID":"3a079af5-b58c-44d4-baa5-fc1bfab08cbb","Type":"ContainerDied","Data":"8243a8e5435ae1e8d35eb6cca4db23e54f8d6696e921fcde1c55945fb8378e0d"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.842091 4867 scope.go:117] "RemoveContainer" containerID="04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.842200 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmk6s" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.846373 4867 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fplqr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.846518 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" podUID="585bf14f-1828-4820-b5b2-2aa8992b4499" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.869778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" event={"ID":"8171ee3e-3418-4537-b2a5-52941744cba6","Type":"ContainerStarted","Data":"9a6283307cdb1720c3045784f846b11a3cf107ad1602de1f2ddd7dbc66461aa8"} Mar 20 00:10:07 crc 
kubenswrapper[4867]: I0320 00:10:07.869818 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.873104 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.878649 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.378632061 +0000 UTC m=+222.605169578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.883206 4867 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c8s9z container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.883242 4867 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" podUID="8171ee3e-3418-4537-b2a5-52941744cba6" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.898730 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" event={"ID":"056f78fa-8b9c-4997-9d3a-00103d32eb09","Type":"ContainerStarted","Data":"741bf412352605d8dc445271ccbabf343105d9da66386c4818dc4a3a76c0282f"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.907687 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qspsc" podStartSLOduration=170.907671645 podStartE2EDuration="2m50.907671645s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.815566247 +0000 UTC m=+222.042103774" watchObservedRunningTime="2026-03-20 00:10:07.907671645 +0000 UTC m=+222.134209162" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.912077 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" podStartSLOduration=170.912062635 podStartE2EDuration="2m50.912062635s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.907288325 +0000 UTC m=+222.133825842" watchObservedRunningTime="2026-03-20 00:10:07.912062635 +0000 UTC m=+222.138600152" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.915247 4867 scope.go:117] "RemoveContainer" 
containerID="04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351" Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.925655 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351\": container with ID starting with 04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351 not found: ID does not exist" containerID="04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.925700 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351"} err="failed to get container status \"04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351\": rpc error: code = NotFound desc = could not find container \"04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351\": container with ID starting with 04c6a037613d47efe74cb9c13ba1450b716591c1b38182a8dfa36db0714fa351 not found: ID does not exist" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.929711 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" event={"ID":"e98ce093-beb3-4d6c-be7b-8afa1e7b514e","Type":"ContainerStarted","Data":"84a42b53af3943d714a5034bb5b3cb81365f61bb89aaa6908ee21d05c116ba51"} Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.941060 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r2ssr" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.944455 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ghlsv" podStartSLOduration=170.944437063 podStartE2EDuration="2m50.944437063s" podCreationTimestamp="2026-03-20 
00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.933909387 +0000 UTC m=+222.160446894" watchObservedRunningTime="2026-03-20 00:10:07.944437063 +0000 UTC m=+222.170974580" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.959904 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fxnfx" podStartSLOduration=170.959891214 podStartE2EDuration="2m50.959891214s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:07.958708754 +0000 UTC m=+222.185246261" watchObservedRunningTime="2026-03-20 00:10:07.959891214 +0000 UTC m=+222.186428731" Mar 20 00:10:07 crc kubenswrapper[4867]: I0320 00:10:07.975761 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:07 crc kubenswrapper[4867]: E0320 00:10:07.977305 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.477289063 +0000 UTC m=+222.703826580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.017623 4867 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4s8b4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": read tcp 10.217.0.2:47798->10.217.0.22:8443: read: connection reset by peer" start-of-body= Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.017954 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" podUID="7921e429-29e3-4d4e-bddd-8b99537ca63f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": read tcp 10.217.0.2:47798->10.217.0.22:8443: read: connection reset by peer" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.018861 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" podStartSLOduration=171.018845303 podStartE2EDuration="2m51.018845303s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:08.016132625 +0000 UTC m=+222.242670142" watchObservedRunningTime="2026-03-20 00:10:08.018845303 +0000 UTC m=+222.245382820" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.066633 4867 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" podStartSLOduration=172.06661841 podStartE2EDuration="2m52.06661841s" podCreationTimestamp="2026-03-20 00:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:08.064747013 +0000 UTC m=+222.291284530" watchObservedRunningTime="2026-03-20 00:10:08.06661841 +0000 UTC m=+222.293155917" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.079137 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.079401 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.579388773 +0000 UTC m=+222.805926290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.102466 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r84fp" podStartSLOduration=171.102449236 podStartE2EDuration="2m51.102449236s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:08.100949298 +0000 UTC m=+222.327486815" watchObservedRunningTime="2026-03-20 00:10:08.102449236 +0000 UTC m=+222.328986753" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.120975 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" podStartSLOduration=171.120960503 podStartE2EDuration="2m51.120960503s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:08.119900967 +0000 UTC m=+222.346438484" watchObservedRunningTime="2026-03-20 00:10:08.120960503 +0000 UTC m=+222.347498020" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.145802 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmk6s"] Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.145853 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-nmk6s"] Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.180995 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.181271 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.681253217 +0000 UTC m=+222.907790734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.181565 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.182045 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.682030066 +0000 UTC m=+222.908567583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.201916 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-v4wzv" podStartSLOduration=171.201896508 podStartE2EDuration="2m51.201896508s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:08.199782605 +0000 UTC m=+222.426320122" watchObservedRunningTime="2026-03-20 00:10:08.201896508 +0000 UTC m=+222.428434035" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.280598 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:08 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:08 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:08 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.280664 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.283186 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.283514 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.78348181 +0000 UTC m=+223.010019327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.384458 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.384776 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 00:10:08.884764799 +0000 UTC m=+223.111302316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.450430 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" path="/var/lib/kubelet/pods/3a079af5-b58c-44d4-baa5-fc1bfab08cbb/volumes" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.485268 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.485466 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.985437742 +0000 UTC m=+223.211975259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.485528 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.485790 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:08.985777611 +0000 UTC m=+223.212315128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.504895 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2dzk5" podStartSLOduration=171.504863983 podStartE2EDuration="2m51.504863983s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:08.228261584 +0000 UTC m=+222.454799091" watchObservedRunningTime="2026-03-20 00:10:08.504863983 +0000 UTC m=+222.731401500" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.506061 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9f5976f56-99fdm"] Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.506257 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" containerName="controller-manager" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.506271 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" containerName="controller-manager" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.506391 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a079af5-b58c-44d4-baa5-fc1bfab08cbb" containerName="controller-manager" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.506827 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.512172 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.512258 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.513157 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.513200 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.513198 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.514305 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.519711 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9f5976f56-99fdm"] Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.531159 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.586928 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:08 crc 
kubenswrapper[4867]: I0320 00:10:08.587227 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44s7\" (UniqueName: \"kubernetes.io/projected/d1e9196a-a19c-4c54-957f-dd4457cea9f7-kube-api-access-n44s7\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.587304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-client-ca\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.587334 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-config\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.587358 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-proxy-ca-bundles\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.587387 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e9196a-a19c-4c54-957f-dd4457cea9f7-serving-cert\") 
pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.587517 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.08747543 +0000 UTC m=+223.314012947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.693442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n44s7\" (UniqueName: \"kubernetes.io/projected/d1e9196a-a19c-4c54-957f-dd4457cea9f7-kube-api-access-n44s7\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.693764 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.693818 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-client-ca\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.693843 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-config\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.693880 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-proxy-ca-bundles\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.693925 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e9196a-a19c-4c54-957f-dd4457cea9f7-serving-cert\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.695141 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.195129329 +0000 UTC m=+223.421666846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.696240 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-client-ca\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.696355 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-proxy-ca-bundles\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.697289 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-config\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.715238 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e9196a-a19c-4c54-957f-dd4457cea9f7-serving-cert\") pod \"controller-manager-9f5976f56-99fdm\" (UID: 
\"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.718061 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44s7\" (UniqueName: \"kubernetes.io/projected/d1e9196a-a19c-4c54-957f-dd4457cea9f7-kube-api-access-n44s7\") pod \"controller-manager-9f5976f56-99fdm\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") " pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.796417 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.796791 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.296773737 +0000 UTC m=+223.523311254 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.829836 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.869349 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-4s8b4_7921e429-29e3-4d4e-bddd-8b99537ca63f/route-controller-manager/0.log" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.869402 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.898729 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:08 crc kubenswrapper[4867]: E0320 00:10:08.899062 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.399048391 +0000 UTC m=+223.625585908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:08 crc kubenswrapper[4867]: I0320 00:10:08.995221 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9sbs"] Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:08.995929 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7921e429-29e3-4d4e-bddd-8b99537ca63f" containerName="route-controller-manager" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.995945 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7921e429-29e3-4d4e-bddd-8b99537ca63f" containerName="route-controller-manager" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.996165 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7921e429-29e3-4d4e-bddd-8b99537ca63f" containerName="route-controller-manager" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.997649 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.999201 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7921e429-29e3-4d4e-bddd-8b99537ca63f-serving-cert\") pod \"7921e429-29e3-4d4e-bddd-8b99537ca63f\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.999253 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdwd\" (UniqueName: \"kubernetes.io/projected/7921e429-29e3-4d4e-bddd-8b99537ca63f-kube-api-access-szdwd\") pod \"7921e429-29e3-4d4e-bddd-8b99537ca63f\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.999302 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-config\") pod \"7921e429-29e3-4d4e-bddd-8b99537ca63f\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.999416 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:08.999522 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-client-ca\") pod \"7921e429-29e3-4d4e-bddd-8b99537ca63f\" (UID: \"7921e429-29e3-4d4e-bddd-8b99537ca63f\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.000468 4867 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-client-ca" (OuterVolumeSpecName: "client-ca") pod "7921e429-29e3-4d4e-bddd-8b99537ca63f" (UID: "7921e429-29e3-4d4e-bddd-8b99537ca63f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.000534 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-config" (OuterVolumeSpecName: "config") pod "7921e429-29e3-4d4e-bddd-8b99537ca63f" (UID: "7921e429-29e3-4d4e-bddd-8b99537ca63f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.000552 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.500537466 +0000 UTC m=+223.727074993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002210 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002302 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-4s8b4_7921e429-29e3-4d4e-bddd-8b99537ca63f/route-controller-manager/0.log" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002344 4867 generic.go:334] "Generic (PLEG): container finished" podID="7921e429-29e3-4d4e-bddd-8b99537ca63f" containerID="b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d" exitCode=255 Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" event={"ID":"7921e429-29e3-4d4e-bddd-8b99537ca63f","Type":"ContainerDied","Data":"b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d"} Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002430 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" event={"ID":"7921e429-29e3-4d4e-bddd-8b99537ca63f","Type":"ContainerDied","Data":"19757bf40b086184b3df363e0597453dc678db1c73c1605e6fa0a7a9f4292bd0"} Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002439 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.002447 4867 scope.go:117] "RemoveContainer" containerID="b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.008986 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7921e429-29e3-4d4e-bddd-8b99537ca63f-kube-api-access-szdwd" (OuterVolumeSpecName: "kube-api-access-szdwd") pod "7921e429-29e3-4d4e-bddd-8b99537ca63f" (UID: "7921e429-29e3-4d4e-bddd-8b99537ca63f"). InnerVolumeSpecName "kube-api-access-szdwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.009547 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9sbs"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.020132 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7921e429-29e3-4d4e-bddd-8b99537ca63f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7921e429-29e3-4d4e-bddd-8b99537ca63f" (UID: "7921e429-29e3-4d4e-bddd-8b99537ca63f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.028138 4867 generic.go:334] "Generic (PLEG): container finished" podID="e98ce093-beb3-4d6c-be7b-8afa1e7b514e" containerID="84a42b53af3943d714a5034bb5b3cb81365f61bb89aaa6908ee21d05c116ba51" exitCode=0 Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.028187 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" event={"ID":"e98ce093-beb3-4d6c-be7b-8afa1e7b514e","Type":"ContainerDied","Data":"84a42b53af3943d714a5034bb5b3cb81365f61bb89aaa6908ee21d05c116ba51"} Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.030916 4867 scope.go:117] "RemoveContainer" containerID="b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.032612 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d\": container with ID starting with b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d not found: ID does not exist" containerID="b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.032643 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d"} err="failed to get container status \"b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d\": rpc error: code = NotFound desc = could not find container \"b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d\": container with ID starting with b6b8d454dd4d0e368f1f2e1d00d1265d95bd20d8170355935c27c50c3218350d not found: ID does not exist" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.033544 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" event={"ID":"7c4d8c0e-d44d-4b0f-b4ee-1953a2fd01e7","Type":"ContainerStarted","Data":"d5a7f981f633de62de3507cb5b1509789cd8345f53af0a794dbe3e37f91a1628"} Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.045889 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-tp4c2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.045904 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fhrs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.045927 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tp4c2" podUID="3283192e-4886-4409-8a9e-dda503c25e85" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.045937 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.049313 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zvttc" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.060518 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c8s9z" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.063312 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" podStartSLOduration=172.063301991 podStartE2EDuration="2m52.063301991s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:09.060678645 +0000 UTC m=+223.287216162" watchObservedRunningTime="2026-03-20 00:10:09.063301991 +0000 UTC m=+223.289839508" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.099941 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.100563 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg9hh\" (UniqueName: \"kubernetes.io/projected/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-kube-api-access-rg9hh\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.100807 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-utilities\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.100829 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-catalog-content\") pod 
\"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.100883 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.110957 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7921e429-29e3-4d4e-bddd-8b99537ca63f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.110997 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szdwd\" (UniqueName: \"kubernetes.io/projected/7921e429-29e3-4d4e-bddd-8b99537ca63f-kube-api-access-szdwd\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.111008 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.111018 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7921e429-29e3-4d4e-bddd-8b99537ca63f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.132053 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.632038438 +0000 UTC m=+223.858575955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.169911 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fplqr" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.188144 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fsj5n"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.191142 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.193633 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.212198 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.212406 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-utilities\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " 
pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.212428 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-catalog-content\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.212506 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg9hh\" (UniqueName: \"kubernetes.io/projected/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-kube-api-access-rg9hh\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.212608 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.712588063 +0000 UTC m=+223.939125580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.212992 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-utilities\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.213082 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-catalog-content\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.228656 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsj5n"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.249542 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9f5976f56-99fdm"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.254881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg9hh\" (UniqueName: \"kubernetes.io/projected/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-kube-api-access-rg9hh\") pod \"certified-operators-w9sbs\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc 
kubenswrapper[4867]: I0320 00:10:09.278612 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:09 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:09 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:09 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.278654 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.323222 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5v5h\" (UniqueName: \"kubernetes.io/projected/61384875-b5b9-4757-839c-2071e973510c-kube-api-access-s5v5h\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.323303 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-utilities\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.323319 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-catalog-content\") pod \"community-operators-fsj5n\" (UID: 
\"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.323367 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.324466 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.824449519 +0000 UTC m=+224.050987036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.333795 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.348555 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.353011 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4s8b4"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.377619 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bxxsq"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.378580 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.407738 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxxsq"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.423841 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.424045 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5v5h\" (UniqueName: \"kubernetes.io/projected/61384875-b5b9-4757-839c-2071e973510c-kube-api-access-s5v5h\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.424073 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-utilities\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.424088 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-catalog-content\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.424460 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-catalog-content\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.424556 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:09.924540398 +0000 UTC m=+224.151077915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.424983 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-utilities\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.447962 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5v5h\" (UniqueName: \"kubernetes.io/projected/61384875-b5b9-4757-839c-2071e973510c-kube-api-access-s5v5h\") pod \"community-operators-fsj5n\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.524888 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-catalog-content\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.524961 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-utilities\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " 
pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.525002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.525023 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhjrb\" (UniqueName: \"kubernetes.io/projected/38f94e88-d9e0-41a6-9dfe-390f2d709596-kube-api-access-lhjrb\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.525304 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.025292274 +0000 UTC m=+224.251829791 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.536941 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.583526 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-68h4b"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.584387 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.603865 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-68h4b"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.628071 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.628222 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.128199404 +0000 UTC m=+224.354736921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.628334 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-utilities\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.628375 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.628411 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhjrb\" (UniqueName: \"kubernetes.io/projected/38f94e88-d9e0-41a6-9dfe-390f2d709596-kube-api-access-lhjrb\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.628463 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-catalog-content\") pod \"certified-operators-bxxsq\" (UID: 
\"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.628738 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.128728147 +0000 UTC m=+224.355265664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.628895 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-catalog-content\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.629115 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-utilities\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.640013 4867 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.658309 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhjrb\" (UniqueName: \"kubernetes.io/projected/38f94e88-d9e0-41a6-9dfe-390f2d709596-kube-api-access-lhjrb\") pod \"certified-operators-bxxsq\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.729160 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.729337 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-utilities\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.729378 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-455wk\" (UniqueName: \"kubernetes.io/projected/a24a1654-a84e-4b71-9209-dfd7e42941d0-kube-api-access-455wk\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.729405 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-catalog-content\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc 
kubenswrapper[4867]: E0320 00:10:09.729539 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.229522634 +0000 UTC m=+224.456060151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.735249 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.748350 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9sbs"] Mar 20 00:10:09 crc kubenswrapper[4867]: W0320 00:10:09.785186 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c1b369_ec0e_43d0_a3ca_8c2b7b74d337.slice/crio-70f0593448c8127dfd3c2038c46a9a06d4aa0de28c09ab3dd181d27cc156e6a5 WatchSource:0}: Error finding container 70f0593448c8127dfd3c2038c46a9a06d4aa0de28c09ab3dd181d27cc156e6a5: Status 404 returned error can't find the container with id 70f0593448c8127dfd3c2038c46a9a06d4aa0de28c09ab3dd181d27cc156e6a5 Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.830964 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-utilities\") pod 
\"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.831025 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-455wk\" (UniqueName: \"kubernetes.io/projected/a24a1654-a84e-4b71-9209-dfd7e42941d0-kube-api-access-455wk\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.831108 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-catalog-content\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.831148 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.831427 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.331415728 +0000 UTC m=+224.557953245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.831646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-utilities\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.831739 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-catalog-content\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.858383 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-455wk\" (UniqueName: \"kubernetes.io/projected/a24a1654-a84e-4b71-9209-dfd7e42941d0-kube-api-access-455wk\") pod \"community-operators-68h4b\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.930140 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.934276 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsj5n"] Mar 20 00:10:09 crc kubenswrapper[4867]: I0320 00:10:09.934681 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 00:10:09 crc kubenswrapper[4867]: E0320 00:10:09.935588 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.435571 +0000 UTC m=+224.662108507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.036765 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.037065 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.537052434 +0000 UTC m=+224.763589951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.075958 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" event={"ID":"d1e9196a-a19c-4c54-957f-dd4457cea9f7","Type":"ContainerStarted","Data":"79c8416aa3ebe1ac57cddc0eb1c860411a3072a3cd8b501999da7ccf5984131c"} Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.075994 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" event={"ID":"d1e9196a-a19c-4c54-957f-dd4457cea9f7","Type":"ContainerStarted","Data":"f2119e76e3c4a82219a6b73a3f4e4d9d88ad5b9917984e7c1f5bac9128f84b87"} Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.076969 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.087105 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsj5n" event={"ID":"61384875-b5b9-4757-839c-2071e973510c","Type":"ContainerStarted","Data":"170a7ed892dcd4be1cf3dfcd9718cb8365cff5852358ca07c477b59a62324da5"} Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.092345 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.094999 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-7smtj" event={"ID":"774b2ca8-d862-4e99-ab70-ca68cfc5122c","Type":"ContainerStarted","Data":"0bc81604b4597f962c25538dd890b7024395ed8bd8b6c18c1d88ecc4ec50fcd1"}
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.095048 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7smtj" event={"ID":"774b2ca8-d862-4e99-ab70-ca68cfc5122c","Type":"ContainerStarted","Data":"4bff2a14e04ef0c0fb3e2d5ba4c4a1afa1d2ed970759ac5b08ac1ae83e3026e9"}
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.097067 4867 generic.go:334] "Generic (PLEG): container finished" podID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerID="3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a" exitCode=0
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.097199 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9sbs" event={"ID":"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337","Type":"ContainerDied","Data":"3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a"}
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.097243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9sbs" event={"ID":"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337","Type":"ContainerStarted","Data":"70f0593448c8127dfd3c2038c46a9a06d4aa0de28c09ab3dd181d27cc156e6a5"}
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.106007 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.114612 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" podStartSLOduration=4.114599343 podStartE2EDuration="4.114599343s" podCreationTimestamp="2026-03-20 00:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:10.096738302 +0000 UTC m=+224.323275819" watchObservedRunningTime="2026-03-20 00:10:10.114599343 +0000 UTC m=+224.341136860"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.141054 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.141928 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.641902343 +0000 UTC m=+224.868439860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.143711 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.144668 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.644618732 +0000 UTC m=+224.871175009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.168160 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-68h4b"]
Mar 20 00:10:10 crc kubenswrapper[4867]: W0320 00:10:10.176203 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24a1654_a84e_4b71_9209_dfd7e42941d0.slice/crio-665fc31fadc8730f602d306c8115db9c1fddcfb53572464e756353342f85e16b WatchSource:0}: Error finding container 665fc31fadc8730f602d306c8115db9c1fddcfb53572464e756353342f85e16b: Status 404 returned error can't find the container with id 665fc31fadc8730f602d306c8115db9c1fddcfb53572464e756353342f85e16b
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.245859 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.246010 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.745982143 +0000 UTC m=+224.972519650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.246449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.247151 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.747142602 +0000 UTC m=+224.973680119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.252911 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxxsq"]
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.272833 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 00:10:10 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld
Mar 20 00:10:10 crc kubenswrapper[4867]: [+]process-running ok
Mar 20 00:10:10 crc kubenswrapper[4867]: healthz check failed
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.272876 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.345347 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.349182 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.349351 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.849303593 +0000 UTC m=+225.075841110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.355673 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.356200 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.856184867 +0000 UTC m=+225.082722384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.431451 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7921e429-29e3-4d4e-bddd-8b99537ca63f" path="/var/lib/kubelet/pods/7921e429-29e3-4d4e-bddd-8b99537ca63f/volumes"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.457128 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.457266 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-config-volume\") pod \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.457309 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2w8w\" (UniqueName: \"kubernetes.io/projected/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-kube-api-access-k2w8w\") pod \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.457335 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-secret-volume\") pod \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\" (UID: \"e98ce093-beb3-4d6c-be7b-8afa1e7b514e\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.458143 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 00:10:10.958122542 +0000 UTC m=+225.184660069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.458796 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-config-volume" (OuterVolumeSpecName: "config-volume") pod "e98ce093-beb3-4d6c-be7b-8afa1e7b514e" (UID: "e98ce093-beb3-4d6c-be7b-8afa1e7b514e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.463098 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e98ce093-beb3-4d6c-be7b-8afa1e7b514e" (UID: "e98ce093-beb3-4d6c-be7b-8afa1e7b514e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.464251 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-kube-api-access-k2w8w" (OuterVolumeSpecName: "kube-api-access-k2w8w") pod "e98ce093-beb3-4d6c-be7b-8afa1e7b514e" (UID: "e98ce093-beb3-4d6c-be7b-8afa1e7b514e"). InnerVolumeSpecName "kube-api-access-k2w8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.513957 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"]
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.514521 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98ce093-beb3-4d6c-be7b-8afa1e7b514e" containerName="collect-profiles"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.514542 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98ce093-beb3-4d6c-be7b-8afa1e7b514e" containerName="collect-profiles"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.514737 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e98ce093-beb3-4d6c-be7b-8afa1e7b514e" containerName="collect-profiles"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.515358 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.518427 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.525305 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.525459 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.526001 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.526045 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.527597 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.529748 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"]
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.558894 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.558969 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.558987 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2w8w\" (UniqueName: \"kubernetes.io/projected/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-kube-api-access-k2w8w\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.559002 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e98ce093-beb3-4d6c-be7b-8afa1e7b514e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:10 crc kubenswrapper[4867]: E0320 00:10:10.559291 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 00:10:11.059276288 +0000 UTC m=+225.285813805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7gtj4" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.608595 4867 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T00:10:09.640031633Z","Handler":null,"Name":""}
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.612065 4867 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.612098 4867 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.659621 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.659879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-client-ca\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.659903 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/524fe09c-29da-4222-a72f-9166c54c09ad-serving-cert\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.659924 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-config\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.659952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nql\" (UniqueName: \"kubernetes.io/projected/524fe09c-29da-4222-a72f-9166c54c09ad-kube-api-access-q9nql\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.664267 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.765224 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-client-ca\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.765265 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/524fe09c-29da-4222-a72f-9166c54c09ad-serving-cert\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.765298 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.765316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-config\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.765333 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nql\" (UniqueName: \"kubernetes.io/projected/524fe09c-29da-4222-a72f-9166c54c09ad-kube-api-access-q9nql\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.766683 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-client-ca\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.766818 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-config\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.771588 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.771622 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.781555 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nql\" (UniqueName: \"kubernetes.io/projected/524fe09c-29da-4222-a72f-9166c54c09ad-kube-api-access-q9nql\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.781736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/524fe09c-29da-4222-a72f-9166c54c09ad-serving-cert\") pod \"route-controller-manager-87857dc64-88cgm\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") " pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.810459 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7gtj4\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") " pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.841290 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.870269 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.874926 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vbfqm"
Mar 20 00:10:10 crc kubenswrapper[4867]: I0320 00:10:10.962255 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.112336 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz" event={"ID":"e98ce093-beb3-4d6c-be7b-8afa1e7b514e","Type":"ContainerDied","Data":"bbf3d30864b1ad0091969fc8ac60d26812438d9f0cf3c78ac6ef1b3ea8b28f71"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.112591 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf3d30864b1ad0091969fc8ac60d26812438d9f0cf3c78ac6ef1b3ea8b28f71"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.112644 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566080-jpqpz"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.115738 4867 generic.go:334] "Generic (PLEG): container finished" podID="61384875-b5b9-4757-839c-2071e973510c" containerID="f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9" exitCode=0
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.115782 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsj5n" event={"ID":"61384875-b5b9-4757-839c-2071e973510c","Type":"ContainerDied","Data":"f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.119170 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7smtj" event={"ID":"774b2ca8-d862-4e99-ab70-ca68cfc5122c","Type":"ContainerStarted","Data":"7957b85469c0a09eb65b9567bbd6e721d9d723de263d6065a3143bcfcf8c74ec"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.127100 4867 generic.go:334] "Generic (PLEG): container finished" podID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerID="56b0b16fdf0275108c3ef7bbe09f6499e1316c587ba3009bdb80979e63065f37" exitCode=0
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.127157 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-68h4b" event={"ID":"a24a1654-a84e-4b71-9209-dfd7e42941d0","Type":"ContainerDied","Data":"56b0b16fdf0275108c3ef7bbe09f6499e1316c587ba3009bdb80979e63065f37"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.127182 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-68h4b" event={"ID":"a24a1654-a84e-4b71-9209-dfd7e42941d0","Type":"ContainerStarted","Data":"665fc31fadc8730f602d306c8115db9c1fddcfb53572464e756353342f85e16b"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.145123 4867 generic.go:334] "Generic (PLEG): container finished" podID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerID="3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3" exitCode=0
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.146902 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxxsq" event={"ID":"38f94e88-d9e0-41a6-9dfe-390f2d709596","Type":"ContainerDied","Data":"3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.146953 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxxsq" event={"ID":"38f94e88-d9e0-41a6-9dfe-390f2d709596","Type":"ContainerStarted","Data":"2f7805c4389bf9d61f62e9a561593a7c00385379ffcb7de6b7000182625df4cc"}
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.183752 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkcm"]
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.199845 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.203770 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.215188 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkcm"]
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.224810 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7smtj" podStartSLOduration=12.224791883 podStartE2EDuration="12.224791883s" podCreationTimestamp="2026-03-20 00:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:11.214962095 +0000 UTC m=+225.441499622" watchObservedRunningTime="2026-03-20 00:10:11.224791883 +0000 UTC m=+225.451329400"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.276738 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 00:10:11 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld
Mar 20 00:10:11 crc kubenswrapper[4867]: [+]process-running ok
Mar 20 00:10:11 crc kubenswrapper[4867]: healthz check failed
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.276791 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.278403 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-utilities\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.278544 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-catalog-content\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.278830 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghsf\" (UniqueName: \"kubernetes.io/projected/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-kube-api-access-gghsf\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.383397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-utilities\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.383451 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-catalog-content\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.383585 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gghsf\" (UniqueName: \"kubernetes.io/projected/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-kube-api-access-gghsf\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.383964 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"]
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.384052 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-utilities\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.385818 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-catalog-content\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.401839 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghsf\" (UniqueName: \"kubernetes.io/projected/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-kube-api-access-gghsf\") pod \"redhat-marketplace-kjkcm\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " pod="openshift-marketplace/redhat-marketplace-kjkcm"
Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.526988 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.534374 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7gtj4"] Mar 20 00:10:11 crc kubenswrapper[4867]: W0320 00:10:11.554868 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df8ee0a_f645_4f19_b75c_9cd29e21be30.slice/crio-7086d5ecefe4bbd71dfe58605d6960d1b88f04bd1e4e6ccfda82168a862e4ccd WatchSource:0}: Error finding container 7086d5ecefe4bbd71dfe58605d6960d1b88f04bd1e4e6ccfda82168a862e4ccd: Status 404 returned error can't find the container with id 7086d5ecefe4bbd71dfe58605d6960d1b88f04bd1e4e6ccfda82168a862e4ccd Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.566587 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pj4c8"] Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.567648 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.570714 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj4c8"] Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.686399 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qm5\" (UniqueName: \"kubernetes.io/projected/33637ed0-f47f-4095-b742-dd29244de21c-kube-api-access-f2qm5\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.686809 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-catalog-content\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.686850 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-utilities\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.788275 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-catalog-content\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.788336 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-utilities\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.788383 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qm5\" (UniqueName: \"kubernetes.io/projected/33637ed0-f47f-4095-b742-dd29244de21c-kube-api-access-f2qm5\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.788939 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-catalog-content\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.789160 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-utilities\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.821073 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qm5\" (UniqueName: \"kubernetes.io/projected/33637ed0-f47f-4095-b742-dd29244de21c-kube-api-access-f2qm5\") pod \"redhat-marketplace-pj4c8\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:11 crc kubenswrapper[4867]: I0320 00:10:11.890645 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.173848 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mkf6s"] Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.177761 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.181676 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkcm"] Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.181758 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.185142 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mkf6s"] Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.193639 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" event={"ID":"5df8ee0a-f645-4f19-b75c-9cd29e21be30","Type":"ContainerStarted","Data":"eb14b8164ea25462478959496409d0f650d0a4a7ffad1ac9e937014f607c00a2"} Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.193681 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" event={"ID":"5df8ee0a-f645-4f19-b75c-9cd29e21be30","Type":"ContainerStarted","Data":"7086d5ecefe4bbd71dfe58605d6960d1b88f04bd1e4e6ccfda82168a862e4ccd"} Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.194288 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.198084 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" event={"ID":"524fe09c-29da-4222-a72f-9166c54c09ad","Type":"ContainerStarted","Data":"a3aadcde959f1c2ba50d77c93430c3264a2e3431491bfc1ef46c85b170f160b5"} Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.198132 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.198148 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" event={"ID":"524fe09c-29da-4222-a72f-9166c54c09ad","Type":"ContainerStarted","Data":"3400feabcde7badaac41c319ee9d35fc48498b301cf2f51d82f41c27ff8b84a7"} Mar 20 00:10:12 crc kubenswrapper[4867]: W0320 00:10:12.200748 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0ef29e_a6e3_486a_b6b0_ad7c2d72a5a9.slice/crio-da6862ff6d5baf84be583fff63754c8506dc27d7b0583f3ec334ab28c3f096e6 WatchSource:0}: Error finding container da6862ff6d5baf84be583fff63754c8506dc27d7b0583f3ec334ab28c3f096e6: Status 404 returned error can't find the container with id da6862ff6d5baf84be583fff63754c8506dc27d7b0583f3ec334ab28c3f096e6 Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.212206 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.271600 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" podStartSLOduration=6.2715822 podStartE2EDuration="6.2715822s" podCreationTimestamp="2026-03-20 00:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 00:10:12.258331195 +0000 UTC m=+226.484868712" watchObservedRunningTime="2026-03-20 00:10:12.2715822 +0000 UTC m=+226.498119717" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.284953 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:12 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:12 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:12 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.285024 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.297940 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-utilities\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.298006 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-catalog-content\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.298073 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgw7w\" 
(UniqueName: \"kubernetes.io/projected/b08fdffb-6c98-4f30-b70b-1592c40e01dc-kube-api-access-mgw7w\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.384279 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" podStartSLOduration=175.384259627 podStartE2EDuration="2m55.384259627s" podCreationTimestamp="2026-03-20 00:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:12.315188302 +0000 UTC m=+226.541725829" watchObservedRunningTime="2026-03-20 00:10:12.384259627 +0000 UTC m=+226.610797144" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.388702 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj4c8"] Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.399803 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-catalog-content\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.400143 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-utilities\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.400177 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgw7w\" (UniqueName: 
\"kubernetes.io/projected/b08fdffb-6c98-4f30-b70b-1592c40e01dc-kube-api-access-mgw7w\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.400419 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-utilities\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.400705 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-catalog-content\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.429468 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgw7w\" (UniqueName: \"kubernetes.io/projected/b08fdffb-6c98-4f30-b70b-1592c40e01dc-kube-api-access-mgw7w\") pod \"redhat-operators-mkf6s\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.450250 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.522875 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.544418 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.545086 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mzkng" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.624062 4867 patch_prober.go:28] interesting pod/console-f9d7485db-mzkng container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.624121 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mzkng" podUID="1656bc75-cc92-468d-b969-0fe7cb29c705" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.638148 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-992sr"] Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.639234 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.652818 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-992sr"] Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.659346 4867 ???:1] "http: TLS handshake error from 192.168.126.11:60966: no serving certificate available for the kubelet" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.702177 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9hf\" (UniqueName: \"kubernetes.io/projected/e04729fe-8c2e-4265-aa19-5717138937bd-kube-api-access-6r9hf\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.702238 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-catalog-content\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.702595 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-utilities\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.769339 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-tp4c2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 00:10:12 
crc kubenswrapper[4867]: I0320 00:10:12.769350 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-tp4c2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.769429 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tp4c2" podUID="3283192e-4886-4409-8a9e-dda503c25e85" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.769528 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tp4c2" podUID="3283192e-4886-4409-8a9e-dda503c25e85" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.803242 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-catalog-content\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.803605 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-utilities\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.803643 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9hf\" (UniqueName: 
\"kubernetes.io/projected/e04729fe-8c2e-4265-aa19-5717138937bd-kube-api-access-6r9hf\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.804475 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-utilities\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.804813 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-catalog-content\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.845683 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9hf\" (UniqueName: \"kubernetes.io/projected/e04729fe-8c2e-4265-aa19-5717138937bd-kube-api-access-6r9hf\") pod \"redhat-operators-992sr\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.849511 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.849548 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:12 crc kubenswrapper[4867]: I0320 00:10:12.873805 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:12 crc kubenswrapper[4867]: 
I0320 00:10:12.927922 4867 ???:1] "http: TLS handshake error from 192.168.126.11:60978: no serving certificate available for the kubelet" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.054139 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.067405 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.068252 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.075802 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.076181 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.080335 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.212090 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77676f98-596b-408d-8bd9-db5fc535d8af-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.212306 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77676f98-596b-408d-8bd9-db5fc535d8af-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.215149 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.216043 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.218447 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.218952 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.221609 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.249339 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mkf6s"] Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.260342 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerID="5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657" exitCode=0 Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.260422 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerDied","Data":"5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657"} Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.260449 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" 
event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerStarted","Data":"da6862ff6d5baf84be583fff63754c8506dc27d7b0583f3ec334ab28c3f096e6"} Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.265027 4867 generic.go:334] "Generic (PLEG): container finished" podID="33637ed0-f47f-4095-b742-dd29244de21c" containerID="28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814" exitCode=0 Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.265717 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerDied","Data":"28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814"} Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.265748 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerStarted","Data":"9737c22e87bceaaf7bc4be2d4ecc0f4d49fbde1a735c07724512c79d973815da"} Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.268851 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9n97f" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.270821 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f98fd" Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.272020 4867 patch_prober.go:28] interesting pod/router-default-5444994796-9n97f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 00:10:13 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Mar 20 00:10:13 crc kubenswrapper[4867]: [+]process-running ok Mar 20 00:10:13 crc kubenswrapper[4867]: healthz check failed Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 
00:10:13.272045 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9n97f" podUID="7912604b-d8ba-4510-9694-c6778f3d1c8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.313128 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.313186 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.313218 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77676f98-596b-408d-8bd9-db5fc535d8af-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.313240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77676f98-596b-408d-8bd9-db5fc535d8af-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.313304 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77676f98-596b-408d-8bd9-db5fc535d8af-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.346708 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77676f98-596b-408d-8bd9-db5fc535d8af-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.387844 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.414331 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.414549 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.421267 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.454771 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.545211 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:13 crc kubenswrapper[4867]: I0320 00:10:13.993868 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-992sr"]
Mar 20 00:10:14 crc kubenswrapper[4867]: W0320 00:10:14.042912 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04729fe_8c2e_4265_aa19_5717138937bd.slice/crio-bdbaa04290888a18d861cc38bd2fcb59fdb2b0eaf117dedfdf44a6c739befed9 WatchSource:0}: Error finding container bdbaa04290888a18d861cc38bd2fcb59fdb2b0eaf117dedfdf44a6c739befed9: Status 404 returned error can't find the container with id bdbaa04290888a18d861cc38bd2fcb59fdb2b0eaf117dedfdf44a6c739befed9
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.048686 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.275125 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9n97f"
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.275426 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-992sr" event={"ID":"e04729fe-8c2e-4265-aa19-5717138937bd","Type":"ContainerStarted","Data":"bdbaa04290888a18d861cc38bd2fcb59fdb2b0eaf117dedfdf44a6c739befed9"}
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.288882 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"77676f98-596b-408d-8bd9-db5fc535d8af","Type":"ContainerStarted","Data":"4674c657e7b8e2a999cce8c9c5796a35b6a173518bc21c11cee933089da3de00"}
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.290524 4867 generic.go:334] "Generic (PLEG): container finished" podID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerID="ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8" exitCode=0
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.290667 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkf6s" event={"ID":"b08fdffb-6c98-4f30-b70b-1592c40e01dc","Type":"ContainerDied","Data":"ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8"}
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.290714 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkf6s" event={"ID":"b08fdffb-6c98-4f30-b70b-1592c40e01dc","Type":"ContainerStarted","Data":"e4ab5e6e53da4a0456e3f75fc9882b818273c81331f7388430274d4a5f5eb8ee"}
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.293611 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9n97f"
Mar 20 00:10:14 crc kubenswrapper[4867]: I0320 00:10:14.444586 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.099862 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z88lc"
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.300182 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"77676f98-596b-408d-8bd9-db5fc535d8af","Type":"ContainerStarted","Data":"e0789663c05035309154548dc7078300bc9ca0b72edcbd0447e567190f8312df"}
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.303449 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15e1492b-3e0d-4cec-b093-ec05b57f9a33","Type":"ContainerStarted","Data":"bfefda82aa5c817f5a21aab9830e012058bfec1625bc4e38f3535074b4943ada"}
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.303505 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15e1492b-3e0d-4cec-b093-ec05b57f9a33","Type":"ContainerStarted","Data":"fd41a768ca22d65e0aeb981cf7d762ea0fbb0382f3dc2543a398923f8e068d30"}
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.306527 4867 generic.go:334] "Generic (PLEG): container finished" podID="e04729fe-8c2e-4265-aa19-5717138937bd" containerID="56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249" exitCode=0
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.306565 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-992sr" event={"ID":"e04729fe-8c2e-4265-aa19-5717138937bd","Type":"ContainerDied","Data":"56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249"}
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.325079 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.325061568 podStartE2EDuration="2.325061568s" podCreationTimestamp="2026-03-20 00:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:15.323978381 +0000 UTC m=+229.550515898" watchObservedRunningTime="2026-03-20 00:10:15.325061568 +0000 UTC m=+229.551599085"
Mar 20 00:10:15 crc kubenswrapper[4867]: I0320 00:10:15.354157 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.354140533 podStartE2EDuration="2.354140533s" podCreationTimestamp="2026-03-20 00:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:15.336761304 +0000 UTC m=+229.563298831" watchObservedRunningTime="2026-03-20 00:10:15.354140533 +0000 UTC m=+229.580678060"
Mar 20 00:10:16 crc kubenswrapper[4867]: I0320 00:10:16.337688 4867 generic.go:334] "Generic (PLEG): container finished" podID="15e1492b-3e0d-4cec-b093-ec05b57f9a33" containerID="bfefda82aa5c817f5a21aab9830e012058bfec1625bc4e38f3535074b4943ada" exitCode=0
Mar 20 00:10:16 crc kubenswrapper[4867]: I0320 00:10:16.337910 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15e1492b-3e0d-4cec-b093-ec05b57f9a33","Type":"ContainerDied","Data":"bfefda82aa5c817f5a21aab9830e012058bfec1625bc4e38f3535074b4943ada"}
Mar 20 00:10:16 crc kubenswrapper[4867]: I0320 00:10:16.340397 4867 generic.go:334] "Generic (PLEG): container finished" podID="77676f98-596b-408d-8bd9-db5fc535d8af" containerID="e0789663c05035309154548dc7078300bc9ca0b72edcbd0447e567190f8312df" exitCode=0
Mar 20 00:10:16 crc kubenswrapper[4867]: I0320 00:10:16.340431 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"77676f98-596b-408d-8bd9-db5fc535d8af","Type":"ContainerDied","Data":"e0789663c05035309154548dc7078300bc9ca0b72edcbd0447e567190f8312df"}
Mar 20 00:10:18 crc kubenswrapper[4867]: I0320 00:10:18.860852 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:10:18 crc kubenswrapper[4867]: I0320 00:10:18.861289 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:10:18 crc kubenswrapper[4867]: I0320 00:10:18.887327 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft"
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.551903 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.558860 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mzkng"
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.769678 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-tp4c2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.770042 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tp4c2" podUID="3283192e-4886-4409-8a9e-dda503c25e85" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.769800 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-tp4c2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.770098 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tp4c2" podUID="3283192e-4886-4409-8a9e-dda503c25e85" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Mar 20 00:10:22 crc kubenswrapper[4867]: I0320 00:10:22.929647 4867 ???:1] "http: TLS handshake error from 192.168.126.11:34222: no serving certificate available for the kubelet"
Mar 20 00:10:25 crc kubenswrapper[4867]: I0320 00:10:25.927044 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9f5976f56-99fdm"]
Mar 20 00:10:25 crc kubenswrapper[4867]: I0320 00:10:25.927381 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" podUID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" containerName="controller-manager" containerID="cri-o://79c8416aa3ebe1ac57cddc0eb1c860411a3072a3cd8b501999da7ccf5984131c" gracePeriod=30
Mar 20 00:10:25 crc kubenswrapper[4867]: I0320 00:10:25.942086 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"]
Mar 20 00:10:25 crc kubenswrapper[4867]: I0320 00:10:25.942790 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" podUID="524fe09c-29da-4222-a72f-9166c54c09ad" containerName="route-controller-manager" containerID="cri-o://a3aadcde959f1c2ba50d77c93430c3264a2e3431491bfc1ef46c85b170f160b5" gracePeriod=30
Mar 20 00:10:26 crc kubenswrapper[4867]: I0320 00:10:26.475990 4867 generic.go:334] "Generic (PLEG): container finished" podID="524fe09c-29da-4222-a72f-9166c54c09ad" containerID="a3aadcde959f1c2ba50d77c93430c3264a2e3431491bfc1ef46c85b170f160b5" exitCode=0
Mar 20 00:10:26 crc kubenswrapper[4867]: I0320 00:10:26.476068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" event={"ID":"524fe09c-29da-4222-a72f-9166c54c09ad","Type":"ContainerDied","Data":"a3aadcde959f1c2ba50d77c93430c3264a2e3431491bfc1ef46c85b170f160b5"}
Mar 20 00:10:26 crc kubenswrapper[4867]: I0320 00:10:26.659833 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 00:10:26 crc kubenswrapper[4867]: I0320 00:10:26.905653 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.066577 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kubelet-dir\") pod \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") "
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.066673 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kube-api-access\") pod \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\" (UID: \"15e1492b-3e0d-4cec-b093-ec05b57f9a33\") "
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.066709 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15e1492b-3e0d-4cec-b093-ec05b57f9a33" (UID: "15e1492b-3e0d-4cec-b093-ec05b57f9a33"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.066901 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.083653 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15e1492b-3e0d-4cec-b093-ec05b57f9a33" (UID: "15e1492b-3e0d-4cec-b093-ec05b57f9a33"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.168475 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15e1492b-3e0d-4cec-b093-ec05b57f9a33-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.484323 4867 generic.go:334] "Generic (PLEG): container finished" podID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" containerID="79c8416aa3ebe1ac57cddc0eb1c860411a3072a3cd8b501999da7ccf5984131c" exitCode=0
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.484445 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" event={"ID":"d1e9196a-a19c-4c54-957f-dd4457cea9f7","Type":"ContainerDied","Data":"79c8416aa3ebe1ac57cddc0eb1c860411a3072a3cd8b501999da7ccf5984131c"}
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.486768 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"15e1492b-3e0d-4cec-b093-ec05b57f9a33","Type":"ContainerDied","Data":"fd41a768ca22d65e0aeb981cf7d762ea0fbb0382f3dc2543a398923f8e068d30"}
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.486809 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.486821 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd41a768ca22d65e0aeb981cf7d762ea0fbb0382f3dc2543a398923f8e068d30"
Mar 20 00:10:27 crc kubenswrapper[4867]: I0320 00:10:27.920219 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.080139 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77676f98-596b-408d-8bd9-db5fc535d8af-kube-api-access\") pod \"77676f98-596b-408d-8bd9-db5fc535d8af\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") "
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.080414 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77676f98-596b-408d-8bd9-db5fc535d8af-kubelet-dir\") pod \"77676f98-596b-408d-8bd9-db5fc535d8af\" (UID: \"77676f98-596b-408d-8bd9-db5fc535d8af\") "
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.080767 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77676f98-596b-408d-8bd9-db5fc535d8af-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "77676f98-596b-408d-8bd9-db5fc535d8af" (UID: "77676f98-596b-408d-8bd9-db5fc535d8af"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.099369 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77676f98-596b-408d-8bd9-db5fc535d8af-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "77676f98-596b-408d-8bd9-db5fc535d8af" (UID: "77676f98-596b-408d-8bd9-db5fc535d8af"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.181712 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77676f98-596b-408d-8bd9-db5fc535d8af-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.181746 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77676f98-596b-408d-8bd9-db5fc535d8af-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.493833 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"77676f98-596b-408d-8bd9-db5fc535d8af","Type":"ContainerDied","Data":"4674c657e7b8e2a999cce8c9c5796a35b6a173518bc21c11cee933089da3de00"}
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.493873 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4674c657e7b8e2a999cce8c9c5796a35b6a173518bc21c11cee933089da3de00"
Mar 20 00:10:28 crc kubenswrapper[4867]: I0320 00:10:28.493883 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 00:10:29 crc kubenswrapper[4867]: I0320 00:10:29.831261 4867 patch_prober.go:28] interesting pod/controller-manager-9f5976f56-99fdm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 00:10:29 crc kubenswrapper[4867]: I0320 00:10:29.831716 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" podUID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 00:10:30 crc kubenswrapper[4867]: I0320 00:10:30.968915 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:10:31 crc kubenswrapper[4867]: I0320 00:10:31.843354 4867 patch_prober.go:28] interesting pod/route-controller-manager-87857dc64-88cgm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 00:10:31 crc kubenswrapper[4867]: I0320 00:10:31.843642 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" podUID="524fe09c-29da-4222-a72f-9166c54c09ad" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 00:10:32 crc kubenswrapper[4867]: I0320 00:10:32.784641 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tp4c2"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.743280 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.748838 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.800352 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68664d8549-l9fb5"]
Mar 20 00:10:34 crc kubenswrapper[4867]: E0320 00:10:34.800817 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77676f98-596b-408d-8bd9-db5fc535d8af" containerName="pruner"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.800846 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="77676f98-596b-408d-8bd9-db5fc535d8af" containerName="pruner"
Mar 20 00:10:34 crc kubenswrapper[4867]: E0320 00:10:34.800864 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" containerName="controller-manager"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.800876 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" containerName="controller-manager"
Mar 20 00:10:34 crc kubenswrapper[4867]: E0320 00:10:34.800891 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e1492b-3e0d-4cec-b093-ec05b57f9a33" containerName="pruner"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.800901 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e1492b-3e0d-4cec-b093-ec05b57f9a33" containerName="pruner"
Mar 20 00:10:34 crc kubenswrapper[4867]: E0320 00:10:34.800931 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524fe09c-29da-4222-a72f-9166c54c09ad" containerName="route-controller-manager"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.800941 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="524fe09c-29da-4222-a72f-9166c54c09ad" containerName="route-controller-manager"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.801107 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e1492b-3e0d-4cec-b093-ec05b57f9a33" containerName="pruner"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.801125 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="77676f98-596b-408d-8bd9-db5fc535d8af" containerName="pruner"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.801140 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" containerName="controller-manager"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.801155 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="524fe09c-29da-4222-a72f-9166c54c09ad" containerName="route-controller-manager"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.801788 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.808359 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68664d8549-l9fb5"]
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880626 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-client-ca\") pod \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880745 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/524fe09c-29da-4222-a72f-9166c54c09ad-serving-cert\") pod \"524fe09c-29da-4222-a72f-9166c54c09ad\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880774 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9nql\" (UniqueName: \"kubernetes.io/projected/524fe09c-29da-4222-a72f-9166c54c09ad-kube-api-access-q9nql\") pod \"524fe09c-29da-4222-a72f-9166c54c09ad\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880813 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-proxy-ca-bundles\") pod \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880842 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e9196a-a19c-4c54-957f-dd4457cea9f7-serving-cert\") pod \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880909 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-config\") pod \"524fe09c-29da-4222-a72f-9166c54c09ad\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-client-ca\") pod \"524fe09c-29da-4222-a72f-9166c54c09ad\" (UID: \"524fe09c-29da-4222-a72f-9166c54c09ad\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.880980 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n44s7\" (UniqueName: \"kubernetes.io/projected/d1e9196a-a19c-4c54-957f-dd4457cea9f7-kube-api-access-n44s7\") pod \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881003 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-config\") pod \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\" (UID: \"d1e9196a-a19c-4c54-957f-dd4457cea9f7\") "
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881161 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7067225-d7e6-4e0e-b83a-c610db059f9a-serving-cert\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881284 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-config\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881310 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-client-ca\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881373 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-proxy-ca-bundles\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881445 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjns\" (UniqueName: \"kubernetes.io/projected/e7067225-d7e6-4e0e-b83a-c610db059f9a-kube-api-access-6jjns\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5"
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881472 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1e9196a-a19c-4c54-957f-dd4457cea9f7" (UID: "d1e9196a-a19c-4c54-957f-dd4457cea9f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881698 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-client-ca" (OuterVolumeSpecName: "client-ca") pod "524fe09c-29da-4222-a72f-9166c54c09ad" (UID: "524fe09c-29da-4222-a72f-9166c54c09ad"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.881743 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-config" (OuterVolumeSpecName: "config") pod "d1e9196a-a19c-4c54-957f-dd4457cea9f7" (UID: "d1e9196a-a19c-4c54-957f-dd4457cea9f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.882026 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d1e9196a-a19c-4c54-957f-dd4457cea9f7" (UID: "d1e9196a-a19c-4c54-957f-dd4457cea9f7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.882245 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-config" (OuterVolumeSpecName: "config") pod "524fe09c-29da-4222-a72f-9166c54c09ad" (UID: "524fe09c-29da-4222-a72f-9166c54c09ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.886295 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e9196a-a19c-4c54-957f-dd4457cea9f7-kube-api-access-n44s7" (OuterVolumeSpecName: "kube-api-access-n44s7") pod "d1e9196a-a19c-4c54-957f-dd4457cea9f7" (UID: "d1e9196a-a19c-4c54-957f-dd4457cea9f7"). InnerVolumeSpecName "kube-api-access-n44s7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.886680 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524fe09c-29da-4222-a72f-9166c54c09ad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "524fe09c-29da-4222-a72f-9166c54c09ad" (UID: "524fe09c-29da-4222-a72f-9166c54c09ad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.886691 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e9196a-a19c-4c54-957f-dd4457cea9f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1e9196a-a19c-4c54-957f-dd4457cea9f7" (UID: "d1e9196a-a19c-4c54-957f-dd4457cea9f7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.887382 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524fe09c-29da-4222-a72f-9166c54c09ad-kube-api-access-q9nql" (OuterVolumeSpecName: "kube-api-access-q9nql") pod "524fe09c-29da-4222-a72f-9166c54c09ad" (UID: "524fe09c-29da-4222-a72f-9166c54c09ad"). InnerVolumeSpecName "kube-api-access-q9nql".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.982671 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-config\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.982769 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-client-ca\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.982801 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjns\" (UniqueName: \"kubernetes.io/projected/e7067225-d7e6-4e0e-b83a-c610db059f9a-kube-api-access-6jjns\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.982825 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-proxy-ca-bundles\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.982853 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7067225-d7e6-4e0e-b83a-c610db059f9a-serving-cert\") pod 
\"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983005 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983026 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/524fe09c-29da-4222-a72f-9166c54c09ad-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983043 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n44s7\" (UniqueName: \"kubernetes.io/projected/d1e9196a-a19c-4c54-957f-dd4457cea9f7-kube-api-access-n44s7\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983061 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983073 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983084 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/524fe09c-29da-4222-a72f-9166c54c09ad-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983126 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9nql\" (UniqueName: \"kubernetes.io/projected/524fe09c-29da-4222-a72f-9166c54c09ad-kube-api-access-q9nql\") on node \"crc\" DevicePath 
\"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983140 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1e9196a-a19c-4c54-957f-dd4457cea9f7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.983154 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1e9196a-a19c-4c54-957f-dd4457cea9f7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.984536 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-client-ca\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.985006 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-proxy-ca-bundles\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.985174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-config\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:34 crc kubenswrapper[4867]: I0320 00:10:34.986615 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7067225-d7e6-4e0e-b83a-c610db059f9a-serving-cert\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.003580 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjns\" (UniqueName: \"kubernetes.io/projected/e7067225-d7e6-4e0e-b83a-c610db059f9a-kube-api-access-6jjns\") pod \"controller-manager-68664d8549-l9fb5\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.115850 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.528507 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" event={"ID":"524fe09c-29da-4222-a72f-9166c54c09ad","Type":"ContainerDied","Data":"3400feabcde7badaac41c319ee9d35fc48498b301cf2f51d82f41c27ff8b84a7"} Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.528543 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm" Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.528565 4867 scope.go:117] "RemoveContainer" containerID="a3aadcde959f1c2ba50d77c93430c3264a2e3431491bfc1ef46c85b170f160b5" Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.531290 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" event={"ID":"d1e9196a-a19c-4c54-957f-dd4457cea9f7","Type":"ContainerDied","Data":"f2119e76e3c4a82219a6b73a3f4e4d9d88ad5b9917984e7c1f5bac9128f84b87"} Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.531376 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9f5976f56-99fdm" Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.553557 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"] Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.557886 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-87857dc64-88cgm"] Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.568310 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9f5976f56-99fdm"] Mar 20 00:10:35 crc kubenswrapper[4867]: I0320 00:10:35.572269 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9f5976f56-99fdm"] Mar 20 00:10:35 crc kubenswrapper[4867]: E0320 00:10:35.775353 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 00:10:35 crc kubenswrapper[4867]: E0320 00:10:35.775485 4867 kuberuntime_manager.go:1274] 
"Unhandled Error" err=< Mar 20 00:10:35 crc kubenswrapper[4867]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 00:10:35 crc kubenswrapper[4867]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bms72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566088-pbdgz_openshift-infra(e91db560-3692-4697-bbf0-3c5a8438d5e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 00:10:35 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:10:35 crc kubenswrapper[4867]: E0320 00:10:35.776685 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" podUID="e91db560-3692-4697-bbf0-3c5a8438d5e5" Mar 20 00:10:36 crc kubenswrapper[4867]: I0320 00:10:36.429136 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524fe09c-29da-4222-a72f-9166c54c09ad" 
path="/var/lib/kubelet/pods/524fe09c-29da-4222-a72f-9166c54c09ad/volumes" Mar 20 00:10:36 crc kubenswrapper[4867]: I0320 00:10:36.429866 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e9196a-a19c-4c54-957f-dd4457cea9f7" path="/var/lib/kubelet/pods/d1e9196a-a19c-4c54-957f-dd4457cea9f7/volumes" Mar 20 00:10:36 crc kubenswrapper[4867]: E0320 00:10:36.537003 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" podUID="e91db560-3692-4697-bbf0-3c5a8438d5e5" Mar 20 00:10:38 crc kubenswrapper[4867]: E0320 00:10:38.575524 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 00:10:38 crc kubenswrapper[4867]: E0320 00:10:38.575882 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 00:10:38 crc kubenswrapper[4867]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 00:10:38 crc kubenswrapper[4867]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sm7lb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566090-fczg6_openshift-infra(a4b1008c-a714-49a2-8c17-e971afc302af): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 00:10:38 crc kubenswrapper[4867]: > logger="UnhandledError" Mar 20 00:10:38 crc kubenswrapper[4867]: E0320 00:10:38.577125 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566090-fczg6" podUID="a4b1008c-a714-49a2-8c17-e971afc302af" Mar 20 00:10:39 crc kubenswrapper[4867]: E0320 00:10:39.556790 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566090-fczg6" podUID="a4b1008c-a714-49a2-8c17-e971afc302af" Mar 20 00:10:41 crc kubenswrapper[4867]: I0320 00:10:41.567790 4867 generic.go:334] "Generic (PLEG): container finished" podID="c74119c8-4cf1-4c7b-9dba-49421dfe52e6" 
containerID="c79a3591c07bd56445fbc0490fd74c3358c507e92900bf951e368bc70d944d72" exitCode=0 Mar 20 00:10:41 crc kubenswrapper[4867]: I0320 00:10:41.567926 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29566080-vq54z" event={"ID":"c74119c8-4cf1-4c7b-9dba-49421dfe52e6","Type":"ContainerDied","Data":"c79a3591c07bd56445fbc0490fd74c3358c507e92900bf951e368bc70d944d72"} Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.039855 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xd2f2" Mar 20 00:10:43 crc kubenswrapper[4867]: E0320 00:10:43.376830 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 00:10:43 crc kubenswrapper[4867]: E0320 00:10:43.377163 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-455wk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-68h4b_openshift-marketplace(a24a1654-a84e-4b71-9209-dfd7e42941d0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:43 crc kubenswrapper[4867]: E0320 00:10:43.378601 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-68h4b" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" Mar 20 00:10:43 crc 
kubenswrapper[4867]: E0320 00:10:43.461053 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 00:10:43 crc kubenswrapper[4867]: E0320 00:10:43.461271 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6r9hf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-992sr_openshift-marketplace(e04729fe-8c2e-4265-aa19-5717138937bd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:43 crc kubenswrapper[4867]: E0320 00:10:43.462461 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-992sr" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.539530 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2"] Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.540170 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.541864 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.542130 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.542334 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.542619 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.542778 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.550479 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.554097 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2"] Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.660678 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-client-ca\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.661107 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8c44c6-a3f8-4676-90a5-79629327c1a0-serving-cert\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.661244 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-config\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.661335 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjk7s\" (UniqueName: 
\"kubernetes.io/projected/1a8c44c6-a3f8-4676-90a5-79629327c1a0-kube-api-access-bjk7s\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.762777 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8c44c6-a3f8-4676-90a5-79629327c1a0-serving-cert\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.762870 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-config\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.762913 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjk7s\" (UniqueName: \"kubernetes.io/projected/1a8c44c6-a3f8-4676-90a5-79629327c1a0-kube-api-access-bjk7s\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.762998 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-client-ca\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.764505 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-client-ca\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.765466 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-config\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.769891 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8c44c6-a3f8-4676-90a5-79629327c1a0-serving-cert\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.784325 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjk7s\" (UniqueName: \"kubernetes.io/projected/1a8c44c6-a3f8-4676-90a5-79629327c1a0-kube-api-access-bjk7s\") pod \"route-controller-manager-5d48f6d844-vbbz2\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.865233 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 00:10:43 crc 
kubenswrapper[4867]: I0320 00:10:43.867180 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.869286 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.869716 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.872159 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.876474 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.965660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/943856ce-59c3-4e9d-8b34-9053717caf7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:43 crc kubenswrapper[4867]: I0320 00:10:43.965734 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943856ce-59c3-4e9d-8b34-9053717caf7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:44 crc kubenswrapper[4867]: I0320 00:10:44.066681 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/943856ce-59c3-4e9d-8b34-9053717caf7f-kubelet-dir\") 
pod \"revision-pruner-9-crc\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:44 crc kubenswrapper[4867]: I0320 00:10:44.066755 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943856ce-59c3-4e9d-8b34-9053717caf7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:44 crc kubenswrapper[4867]: I0320 00:10:44.066832 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/943856ce-59c3-4e9d-8b34-9053717caf7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:44 crc kubenswrapper[4867]: I0320 00:10:44.083023 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943856ce-59c3-4e9d-8b34-9053717caf7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:44 crc kubenswrapper[4867]: I0320 00:10:44.192542 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:44 crc kubenswrapper[4867]: E0320 00:10:44.901016 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-992sr" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" Mar 20 00:10:44 crc kubenswrapper[4867]: E0320 00:10:44.901058 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-68h4b" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" Mar 20 00:10:44 crc kubenswrapper[4867]: E0320 00:10:44.963735 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 00:10:44 crc kubenswrapper[4867]: E0320 00:10:44.963909 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2qm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-pj4c8_openshift-marketplace(33637ed0-f47f-4095-b742-dd29244de21c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:44 crc kubenswrapper[4867]: E0320 00:10:44.965094 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pj4c8" podUID="33637ed0-f47f-4095-b742-dd29244de21c" Mar 20 00:10:45 crc 
kubenswrapper[4867]: I0320 00:10:45.885773 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68664d8549-l9fb5"] Mar 20 00:10:45 crc kubenswrapper[4867]: I0320 00:10:45.987780 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2"] Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.548245 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-pj4c8" podUID="33637ed0-f47f-4095-b742-dd29244de21c" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.592074 4867 scope.go:117] "RemoveContainer" containerID="79c8416aa3ebe1ac57cddc0eb1c860411a3072a3cd8b501999da7ccf5984131c" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.595146 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.616385 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29566080-vq54z" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.616389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29566080-vq54z" event={"ID":"c74119c8-4cf1-4c7b-9dba-49421dfe52e6","Type":"ContainerDied","Data":"f7363620a6e65acd15e636661b16c6bfd5a0b1da89fd34e0f5091acbc1abe61c"} Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.616477 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7363620a6e65acd15e636661b16c6bfd5a0b1da89fd34e0f5091acbc1abe61c" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.620694 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.620908 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rg9hh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w9sbs_openshift-marketplace(58c1b369-ec0e-43d0-a3ca-8c2b7b74d337): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.623746 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w9sbs" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" Mar 20 00:10:46 crc 
kubenswrapper[4867]: E0320 00:10:46.660653 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.661283 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gghsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-kjkcm_openshift-marketplace(8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.663160 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kjkcm" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.670356 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.670598 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lhjrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bxxsq_openshift-marketplace(38f94e88-d9e0-41a6-9dfe-390f2d709596): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.671858 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bxxsq" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" Mar 20 00:10:46 crc 
kubenswrapper[4867]: I0320 00:10:46.703642 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-serviceca\") pod \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.703683 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw62n\" (UniqueName: \"kubernetes.io/projected/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-kube-api-access-zw62n\") pod \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\" (UID: \"c74119c8-4cf1-4c7b-9dba-49421dfe52e6\") " Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.705168 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-serviceca" (OuterVolumeSpecName: "serviceca") pod "c74119c8-4cf1-4c7b-9dba-49421dfe52e6" (UID: "c74119c8-4cf1-4c7b-9dba-49421dfe52e6"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.706617 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.706792 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5v5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod community-operators-fsj5n_openshift-marketplace(61384875-b5b9-4757-839c-2071e973510c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:10:46 crc kubenswrapper[4867]: E0320 00:10:46.708182 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fsj5n" podUID="61384875-b5b9-4757-839c-2071e973510c" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.710885 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-kube-api-access-zw62n" (OuterVolumeSpecName: "kube-api-access-zw62n") pod "c74119c8-4cf1-4c7b-9dba-49421dfe52e6" (UID: "c74119c8-4cf1-4c7b-9dba-49421dfe52e6"). InnerVolumeSpecName "kube-api-access-zw62n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.804797 4867 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.805169 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw62n\" (UniqueName: \"kubernetes.io/projected/c74119c8-4cf1-4c7b-9dba-49421dfe52e6-kube-api-access-zw62n\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:46 crc kubenswrapper[4867]: I0320 00:10:46.826024 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2"] Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.111115 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.114912 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68664d8549-l9fb5"] Mar 20 00:10:47 crc kubenswrapper[4867]: W0320 00:10:47.128865 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod943856ce_59c3_4e9d_8b34_9053717caf7f.slice/crio-f9cda9ffebf7115ed585dc3ff887a54c210794e4dc8018c7a0b4c51d5643f1d7 WatchSource:0}: Error finding container f9cda9ffebf7115ed585dc3ff887a54c210794e4dc8018c7a0b4c51d5643f1d7: Status 404 returned error can't find the container with id f9cda9ffebf7115ed585dc3ff887a54c210794e4dc8018c7a0b4c51d5643f1d7 Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.626508 4867 generic.go:334] "Generic (PLEG): container finished" podID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerID="f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f" exitCode=0 Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.626600 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkf6s" event={"ID":"b08fdffb-6c98-4f30-b70b-1592c40e01dc","Type":"ContainerDied","Data":"f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.630114 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" event={"ID":"e7067225-d7e6-4e0e-b83a-c610db059f9a","Type":"ContainerStarted","Data":"947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.630156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" event={"ID":"e7067225-d7e6-4e0e-b83a-c610db059f9a","Type":"ContainerStarted","Data":"0ee49a9138e096a67efe28411d118f173f1f3ae84a434e31f480465aa3905ad0"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.630159 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" podUID="e7067225-d7e6-4e0e-b83a-c610db059f9a" containerName="controller-manager" containerID="cri-o://947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b" gracePeriod=30 Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.630339 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.637237 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"943856ce-59c3-4e9d-8b34-9053717caf7f","Type":"ContainerStarted","Data":"5cf01aca45b9edd440ecd2ba68bf3bc2a34e128a54f48d7c40269b73791dcb0b"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.637284 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"943856ce-59c3-4e9d-8b34-9053717caf7f","Type":"ContainerStarted","Data":"f9cda9ffebf7115ed585dc3ff887a54c210794e4dc8018c7a0b4c51d5643f1d7"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.638715 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" event={"ID":"1a8c44c6-a3f8-4676-90a5-79629327c1a0","Type":"ContainerStarted","Data":"8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.638833 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" event={"ID":"1a8c44c6-a3f8-4676-90a5-79629327c1a0","Type":"ContainerStarted","Data":"c6395647e07bce5d32c0f8b0c4ec807c389c9da8568a52dd4c7a29b2dc9a2e14"} Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.638932 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.638823 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" podUID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" containerName="route-controller-manager" containerID="cri-o://8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39" gracePeriod=30 Mar 20 00:10:47 crc kubenswrapper[4867]: E0320 00:10:47.640902 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fsj5n" podUID="61384875-b5b9-4757-839c-2071e973510c" Mar 20 00:10:47 crc kubenswrapper[4867]: E0320 00:10:47.641167 4867 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kjkcm" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" Mar 20 00:10:47 crc kubenswrapper[4867]: E0320 00:10:47.641212 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w9sbs" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" Mar 20 00:10:47 crc kubenswrapper[4867]: E0320 00:10:47.641243 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bxxsq" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.649336 4867 patch_prober.go:28] interesting pod/controller-manager-68664d8549-l9fb5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:39652->10.217.0.60:8443: read: connection reset by peer" start-of-body= Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.649415 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" podUID="e7067225-d7e6-4e0e-b83a-c610db059f9a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:39652->10.217.0.60:8443: read: connection reset by peer" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.655965 4867 
patch_prober.go:28] interesting pod/route-controller-manager-5d48f6d844-vbbz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": EOF" start-of-body= Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.656024 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" podUID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": EOF" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.670120 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" podStartSLOduration=22.670104771 podStartE2EDuration="22.670104771s" podCreationTimestamp="2026-03-20 00:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:47.668305415 +0000 UTC m=+261.894842942" watchObservedRunningTime="2026-03-20 00:10:47.670104771 +0000 UTC m=+261.896642288" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.684483 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.684468394 podStartE2EDuration="4.684468394s" podCreationTimestamp="2026-03-20 00:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:47.682651588 +0000 UTC m=+261.909189105" watchObservedRunningTime="2026-03-20 00:10:47.684468394 +0000 UTC m=+261.911005911" Mar 20 00:10:47 crc kubenswrapper[4867]: I0320 00:10:47.737196 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" podStartSLOduration=22.737153456 podStartE2EDuration="22.737153456s" podCreationTimestamp="2026-03-20 00:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:47.733503832 +0000 UTC m=+261.960041359" watchObservedRunningTime="2026-03-20 00:10:47.737153456 +0000 UTC m=+261.963690983" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.001069 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.005801 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124024 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jjns\" (UniqueName: \"kubernetes.io/projected/e7067225-d7e6-4e0e-b83a-c610db059f9a-kube-api-access-6jjns\") pod \"e7067225-d7e6-4e0e-b83a-c610db059f9a\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124089 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-proxy-ca-bundles\") pod \"e7067225-d7e6-4e0e-b83a-c610db059f9a\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124157 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-config\") pod \"e7067225-d7e6-4e0e-b83a-c610db059f9a\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " Mar 20 00:10:48 crc 
kubenswrapper[4867]: I0320 00:10:48.124187 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjk7s\" (UniqueName: \"kubernetes.io/projected/1a8c44c6-a3f8-4676-90a5-79629327c1a0-kube-api-access-bjk7s\") pod \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124234 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-config\") pod \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124278 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7067225-d7e6-4e0e-b83a-c610db059f9a-serving-cert\") pod \"e7067225-d7e6-4e0e-b83a-c610db059f9a\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124323 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8c44c6-a3f8-4676-90a5-79629327c1a0-serving-cert\") pod \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124386 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-client-ca\") pod \"e7067225-d7e6-4e0e-b83a-c610db059f9a\" (UID: \"e7067225-d7e6-4e0e-b83a-c610db059f9a\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.124414 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-client-ca\") pod 
\"1a8c44c6-a3f8-4676-90a5-79629327c1a0\" (UID: \"1a8c44c6-a3f8-4676-90a5-79629327c1a0\") " Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.126009 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e7067225-d7e6-4e0e-b83a-c610db059f9a" (UID: "e7067225-d7e6-4e0e-b83a-c610db059f9a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.126134 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7067225-d7e6-4e0e-b83a-c610db059f9a" (UID: "e7067225-d7e6-4e0e-b83a-c610db059f9a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.126182 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-config" (OuterVolumeSpecName: "config") pod "e7067225-d7e6-4e0e-b83a-c610db059f9a" (UID: "e7067225-d7e6-4e0e-b83a-c610db059f9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.126647 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "1a8c44c6-a3f8-4676-90a5-79629327c1a0" (UID: "1a8c44c6-a3f8-4676-90a5-79629327c1a0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.126716 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-config" (OuterVolumeSpecName: "config") pod "1a8c44c6-a3f8-4676-90a5-79629327c1a0" (UID: "1a8c44c6-a3f8-4676-90a5-79629327c1a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.130046 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7067225-d7e6-4e0e-b83a-c610db059f9a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7067225-d7e6-4e0e-b83a-c610db059f9a" (UID: "e7067225-d7e6-4e0e-b83a-c610db059f9a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.130188 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8c44c6-a3f8-4676-90a5-79629327c1a0-kube-api-access-bjk7s" (OuterVolumeSpecName: "kube-api-access-bjk7s") pod "1a8c44c6-a3f8-4676-90a5-79629327c1a0" (UID: "1a8c44c6-a3f8-4676-90a5-79629327c1a0"). InnerVolumeSpecName "kube-api-access-bjk7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.130214 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8c44c6-a3f8-4676-90a5-79629327c1a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1a8c44c6-a3f8-4676-90a5-79629327c1a0" (UID: "1a8c44c6-a3f8-4676-90a5-79629327c1a0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.134531 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7067225-d7e6-4e0e-b83a-c610db059f9a-kube-api-access-6jjns" (OuterVolumeSpecName: "kube-api-access-6jjns") pod "e7067225-d7e6-4e0e-b83a-c610db059f9a" (UID: "e7067225-d7e6-4e0e-b83a-c610db059f9a"). InnerVolumeSpecName "kube-api-access-6jjns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225347 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225643 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjk7s\" (UniqueName: \"kubernetes.io/projected/1a8c44c6-a3f8-4676-90a5-79629327c1a0-kube-api-access-bjk7s\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225658 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225668 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7067225-d7e6-4e0e-b83a-c610db059f9a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225679 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8c44c6-a3f8-4676-90a5-79629327c1a0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225691 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225700 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8c44c6-a3f8-4676-90a5-79629327c1a0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225711 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jjns\" (UniqueName: \"kubernetes.io/projected/e7067225-d7e6-4e0e-b83a-c610db059f9a-kube-api-access-6jjns\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.225721 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7067225-d7e6-4e0e-b83a-c610db059f9a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.649846 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkf6s" event={"ID":"b08fdffb-6c98-4f30-b70b-1592c40e01dc","Type":"ContainerStarted","Data":"9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5"} Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.652576 4867 generic.go:334] "Generic (PLEG): container finished" podID="e7067225-d7e6-4e0e-b83a-c610db059f9a" containerID="947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b" exitCode=0 Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.652655 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" event={"ID":"e7067225-d7e6-4e0e-b83a-c610db059f9a","Type":"ContainerDied","Data":"947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b"} Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.652673 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" event={"ID":"e7067225-d7e6-4e0e-b83a-c610db059f9a","Type":"ContainerDied","Data":"0ee49a9138e096a67efe28411d118f173f1f3ae84a434e31f480465aa3905ad0"} Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.652677 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68664d8549-l9fb5" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.652690 4867 scope.go:117] "RemoveContainer" containerID="947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.655968 4867 generic.go:334] "Generic (PLEG): container finished" podID="943856ce-59c3-4e9d-8b34-9053717caf7f" containerID="5cf01aca45b9edd440ecd2ba68bf3bc2a34e128a54f48d7c40269b73791dcb0b" exitCode=0 Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.656289 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"943856ce-59c3-4e9d-8b34-9053717caf7f","Type":"ContainerDied","Data":"5cf01aca45b9edd440ecd2ba68bf3bc2a34e128a54f48d7c40269b73791dcb0b"} Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.661011 4867 generic.go:334] "Generic (PLEG): container finished" podID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" containerID="8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39" exitCode=0 Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.661057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" event={"ID":"1a8c44c6-a3f8-4676-90a5-79629327c1a0","Type":"ContainerDied","Data":"8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39"} Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.661083 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" 
event={"ID":"1a8c44c6-a3f8-4676-90a5-79629327c1a0","Type":"ContainerDied","Data":"c6395647e07bce5d32c0f8b0c4ec807c389c9da8568a52dd4c7a29b2dc9a2e14"} Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.661188 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.674533 4867 scope.go:117] "RemoveContainer" containerID="947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b" Mar 20 00:10:48 crc kubenswrapper[4867]: E0320 00:10:48.675962 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b\": container with ID starting with 947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b not found: ID does not exist" containerID="947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.675999 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b"} err="failed to get container status \"947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b\": rpc error: code = NotFound desc = could not find container \"947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b\": container with ID starting with 947d324b8d1720d369ffd7e36048c850d6f3447b6ce119a6564ee2d4250ca79b not found: ID does not exist" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.676021 4867 scope.go:117] "RemoveContainer" containerID="8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.711152 4867 scope.go:117] "RemoveContainer" containerID="8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39" Mar 20 00:10:48 crc 
kubenswrapper[4867]: I0320 00:10:48.711094 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mkf6s" podStartSLOduration=2.6760066030000003 podStartE2EDuration="36.711059805s" podCreationTimestamp="2026-03-20 00:10:12 +0000 UTC" firstStartedPulling="2026-03-20 00:10:14.293637328 +0000 UTC m=+228.520174845" lastFinishedPulling="2026-03-20 00:10:48.32869053 +0000 UTC m=+262.555228047" observedRunningTime="2026-03-20 00:10:48.669528489 +0000 UTC m=+262.896066046" watchObservedRunningTime="2026-03-20 00:10:48.711059805 +0000 UTC m=+262.937597402" Mar 20 00:10:48 crc kubenswrapper[4867]: E0320 00:10:48.711982 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39\": container with ID starting with 8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39 not found: ID does not exist" containerID="8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.712012 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39"} err="failed to get container status \"8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39\": rpc error: code = NotFound desc = could not find container \"8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39\": container with ID starting with 8ce3dac8e5343f6b2c12ef1015a4c9d17a7eb468e757e1ba441adbaa6c6bcd39 not found: ID does not exist" Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.722303 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68664d8549-l9fb5"] Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.728026 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-68664d8549-l9fb5"] Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.732609 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2"] Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.738008 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d48f6d844-vbbz2"] Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.860032 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:10:48 crc kubenswrapper[4867]: I0320 00:10:48.860116 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543104 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5"] Mar 20 00:10:49 crc kubenswrapper[4867]: E0320 00:10:49.543451 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" containerName="route-controller-manager" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543472 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" containerName="route-controller-manager" Mar 20 00:10:49 crc kubenswrapper[4867]: E0320 00:10:49.543528 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e7067225-d7e6-4e0e-b83a-c610db059f9a" containerName="controller-manager" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543542 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7067225-d7e6-4e0e-b83a-c610db059f9a" containerName="controller-manager" Mar 20 00:10:49 crc kubenswrapper[4867]: E0320 00:10:49.543572 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74119c8-4cf1-4c7b-9dba-49421dfe52e6" containerName="image-pruner" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543585 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74119c8-4cf1-4c7b-9dba-49421dfe52e6" containerName="image-pruner" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543757 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7067225-d7e6-4e0e-b83a-c610db059f9a" containerName="controller-manager" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543778 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74119c8-4cf1-4c7b-9dba-49421dfe52e6" containerName="image-pruner" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.543796 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" containerName="route-controller-manager" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.544427 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.546581 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cc6595674-hm274"] Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.546783 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.547367 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.549437 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.549902 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.550084 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.551263 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.552591 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.552821 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.553474 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.553823 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.553844 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.554378 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.563210 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.565289 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.565464 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5"] Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.622599 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc6595674-hm274"] Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.745514 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0c542d-82d1-4dd4-9607-05ab0325e90e-serving-cert\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.745922 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-proxy-ca-bundles\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746041 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-config\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746102 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-client-ca\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6cj\" (UniqueName: \"kubernetes.io/projected/9f0c542d-82d1-4dd4-9607-05ab0325e90e-kube-api-access-jz6cj\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746185 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-client-ca\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " 
pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746213 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b79faf1-1e7c-46c4-a0ba-f452920d748d-serving-cert\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746462 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc52j\" (UniqueName: \"kubernetes.io/projected/1b79faf1-1e7c-46c4-a0ba-f452920d748d-kube-api-access-cc52j\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.746545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-config\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847105 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-proxy-ca-bundles\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847158 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-config\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-client-ca\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847211 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6cj\" (UniqueName: \"kubernetes.io/projected/9f0c542d-82d1-4dd4-9607-05ab0325e90e-kube-api-access-jz6cj\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847237 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-client-ca\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847255 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b79faf1-1e7c-46c4-a0ba-f452920d748d-serving-cert\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " 
pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847301 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc52j\" (UniqueName: \"kubernetes.io/projected/1b79faf1-1e7c-46c4-a0ba-f452920d748d-kube-api-access-cc52j\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847329 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-config\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.847360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0c542d-82d1-4dd4-9607-05ab0325e90e-serving-cert\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.848825 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-client-ca\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.848837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-config\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.849449 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-config\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.849839 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-client-ca\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.850448 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-proxy-ca-bundles\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.854456 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0c542d-82d1-4dd4-9607-05ab0325e90e-serving-cert\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.854535 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b79faf1-1e7c-46c4-a0ba-f452920d748d-serving-cert\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.864920 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.865579 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.872245 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc52j\" (UniqueName: \"kubernetes.io/projected/1b79faf1-1e7c-46c4-a0ba-f452920d748d-kube-api-access-cc52j\") pod \"route-controller-manager-549984b657-jnxb5\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.875604 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.889068 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6cj\" (UniqueName: \"kubernetes.io/projected/9f0c542d-82d1-4dd4-9607-05ab0325e90e-kube-api-access-jz6cj\") pod \"controller-manager-7cc6595674-hm274\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.914277 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.923885 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.948703 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kube-api-access\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.948801 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.948881 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-var-lock\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:49 crc kubenswrapper[4867]: I0320 00:10:49.966571 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.049754 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943856ce-59c3-4e9d-8b34-9053717caf7f-kube-api-access\") pod \"943856ce-59c3-4e9d-8b34-9053717caf7f\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050094 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/943856ce-59c3-4e9d-8b34-9053717caf7f-kubelet-dir\") pod \"943856ce-59c3-4e9d-8b34-9053717caf7f\" (UID: \"943856ce-59c3-4e9d-8b34-9053717caf7f\") " Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050247 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-var-lock\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050311 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kube-api-access\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050339 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050424 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050470 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/943856ce-59c3-4e9d-8b34-9053717caf7f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "943856ce-59c3-4e9d-8b34-9053717caf7f" (UID: "943856ce-59c3-4e9d-8b34-9053717caf7f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.050524 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-var-lock\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.073868 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943856ce-59c3-4e9d-8b34-9053717caf7f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "943856ce-59c3-4e9d-8b34-9053717caf7f" (UID: "943856ce-59c3-4e9d-8b34-9053717caf7f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.077509 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kube-api-access\") pod \"installer-9-crc\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.111822 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5"] Mar 20 00:10:50 crc kubenswrapper[4867]: W0320 00:10:50.116957 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b79faf1_1e7c_46c4_a0ba_f452920d748d.slice/crio-df08fd4cb52d33cdc4ff6a87d2dc340cbeaf832f3fb5c886c207cb8858583d5f WatchSource:0}: Error finding container df08fd4cb52d33cdc4ff6a87d2dc340cbeaf832f3fb5c886c207cb8858583d5f: Status 404 returned error can't find the container with id df08fd4cb52d33cdc4ff6a87d2dc340cbeaf832f3fb5c886c207cb8858583d5f Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.151647 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/943856ce-59c3-4e9d-8b34-9053717caf7f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.151676 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/943856ce-59c3-4e9d-8b34-9053717caf7f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.165268 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc6595674-hm274"] Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.264706 4867 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.445445 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8c44c6-a3f8-4676-90a5-79629327c1a0" path="/var/lib/kubelet/pods/1a8c44c6-a3f8-4676-90a5-79629327c1a0/volumes" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.446578 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7067225-d7e6-4e0e-b83a-c610db059f9a" path="/var/lib/kubelet/pods/e7067225-d7e6-4e0e-b83a-c610db059f9a/volumes" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.661510 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 00:10:50 crc kubenswrapper[4867]: W0320 00:10:50.666228 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf8444d01_ed92_4d2f_9c64_d0b084e606e3.slice/crio-3c68b36d95239e1a6814ac6ab2ed26283627e7aed8e4b7ff8c566905a9172a89 WatchSource:0}: Error finding container 3c68b36d95239e1a6814ac6ab2ed26283627e7aed8e4b7ff8c566905a9172a89: Status 404 returned error can't find the container with id 3c68b36d95239e1a6814ac6ab2ed26283627e7aed8e4b7ff8c566905a9172a89 Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.684681 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" event={"ID":"9f0c542d-82d1-4dd4-9607-05ab0325e90e","Type":"ContainerStarted","Data":"6447f6d467b2431cdd529f6d0c4f65c688a1ab7af2bfb35cbb3af64c3ab2e454"} Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.684725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" event={"ID":"9f0c542d-82d1-4dd4-9607-05ab0325e90e","Type":"ContainerStarted","Data":"5392f18b1be81b5e2664be06430494dfeae427d4358532b095fd1645d56cb15e"} Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.685795 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.687300 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" event={"ID":"1b79faf1-1e7c-46c4-a0ba-f452920d748d","Type":"ContainerStarted","Data":"5694c7ce8166c0091bf9ed6c7d510f664e73c1f2895655fb085ffc6e5d332e62"} Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.687325 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" event={"ID":"1b79faf1-1e7c-46c4-a0ba-f452920d748d","Type":"ContainerStarted","Data":"df08fd4cb52d33cdc4ff6a87d2dc340cbeaf832f3fb5c886c207cb8858583d5f"} Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.687783 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.692382 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.692711 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"943856ce-59c3-4e9d-8b34-9053717caf7f","Type":"ContainerDied","Data":"f9cda9ffebf7115ed585dc3ff887a54c210794e4dc8018c7a0b4c51d5643f1d7"} Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.692964 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9cda9ffebf7115ed585dc3ff887a54c210794e4dc8018c7a0b4c51d5643f1d7" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.692753 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.694555 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8444d01-ed92-4d2f-9c64-d0b084e606e3","Type":"ContainerStarted","Data":"3c68b36d95239e1a6814ac6ab2ed26283627e7aed8e4b7ff8c566905a9172a89"} Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.709016 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" podStartSLOduration=5.708999724 podStartE2EDuration="5.708999724s" podCreationTimestamp="2026-03-20 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:50.705623377 +0000 UTC m=+264.932160914" watchObservedRunningTime="2026-03-20 00:10:50.708999724 +0000 UTC m=+264.935537241" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.753954 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" podStartSLOduration=5.753937278 podStartE2EDuration="5.753937278s" podCreationTimestamp="2026-03-20 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:50.753119177 +0000 UTC m=+264.979656724" watchObservedRunningTime="2026-03-20 00:10:50.753937278 +0000 UTC m=+264.980474795" Mar 20 00:10:50 crc kubenswrapper[4867]: I0320 00:10:50.802535 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:10:51 crc kubenswrapper[4867]: I0320 00:10:51.702003 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"f8444d01-ed92-4d2f-9c64-d0b084e606e3","Type":"ContainerStarted","Data":"25968288f06038befe97e23d983146f23df87d419a6e684f476a5672c2274aa5"} Mar 20 00:10:51 crc kubenswrapper[4867]: I0320 00:10:51.706148 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" event={"ID":"e91db560-3692-4697-bbf0-3c5a8438d5e5","Type":"ContainerStarted","Data":"a7e3db854172fcfe7fd52c5129cfbbb027b3bb941031a8dd48e9af48f157292e"} Mar 20 00:10:51 crc kubenswrapper[4867]: I0320 00:10:51.727381 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.727365475 podStartE2EDuration="2.727365475s" podCreationTimestamp="2026-03-20 00:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:10:51.723979687 +0000 UTC m=+265.950517204" watchObservedRunningTime="2026-03-20 00:10:51.727365475 +0000 UTC m=+265.953902992" Mar 20 00:10:51 crc kubenswrapper[4867]: I0320 00:10:51.743583 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" podStartSLOduration=125.877578691 podStartE2EDuration="2m51.743566265s" podCreationTimestamp="2026-03-20 00:08:00 +0000 UTC" firstStartedPulling="2026-03-20 00:10:05.566339819 +0000 UTC m=+219.792877336" lastFinishedPulling="2026-03-20 00:10:51.432327393 +0000 UTC m=+265.658864910" observedRunningTime="2026-03-20 00:10:51.741316487 +0000 UTC m=+265.967854004" watchObservedRunningTime="2026-03-20 00:10:51.743566265 +0000 UTC m=+265.970103782" Mar 20 00:10:52 crc kubenswrapper[4867]: I0320 00:10:52.042690 4867 csr.go:261] certificate signing request csr-lskx9 is approved, waiting to be issued Mar 20 00:10:52 crc kubenswrapper[4867]: I0320 00:10:52.048508 4867 csr.go:257] certificate signing request csr-lskx9 is issued Mar 20 00:10:52 crc 
kubenswrapper[4867]: I0320 00:10:52.523653 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:52 crc kubenswrapper[4867]: I0320 00:10:52.523712 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:10:52 crc kubenswrapper[4867]: I0320 00:10:52.711789 4867 generic.go:334] "Generic (PLEG): container finished" podID="e91db560-3692-4697-bbf0-3c5a8438d5e5" containerID="a7e3db854172fcfe7fd52c5129cfbbb027b3bb941031a8dd48e9af48f157292e" exitCode=0 Mar 20 00:10:52 crc kubenswrapper[4867]: I0320 00:10:52.711881 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" event={"ID":"e91db560-3692-4697-bbf0-3c5a8438d5e5","Type":"ContainerDied","Data":"a7e3db854172fcfe7fd52c5129cfbbb027b3bb941031a8dd48e9af48f157292e"} Mar 20 00:10:53 crc kubenswrapper[4867]: I0320 00:10:53.050502 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-23 08:36:44.789324207 +0000 UTC Mar 20 00:10:53 crc kubenswrapper[4867]: I0320 00:10:53.050540 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6680h25m51.738786114s for next certificate rotation Mar 20 00:10:53 crc kubenswrapper[4867]: I0320 00:10:53.650641 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mkf6s" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="registry-server" probeResult="failure" output=< Mar 20 00:10:53 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Mar 20 00:10:53 crc kubenswrapper[4867]: > Mar 20 00:10:53 crc kubenswrapper[4867]: I0320 00:10:53.976116 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.007130 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bms72\" (UniqueName: \"kubernetes.io/projected/e91db560-3692-4697-bbf0-3c5a8438d5e5-kube-api-access-bms72\") pod \"e91db560-3692-4697-bbf0-3c5a8438d5e5\" (UID: \"e91db560-3692-4697-bbf0-3c5a8438d5e5\") " Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.017048 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91db560-3692-4697-bbf0-3c5a8438d5e5-kube-api-access-bms72" (OuterVolumeSpecName: "kube-api-access-bms72") pod "e91db560-3692-4697-bbf0-3c5a8438d5e5" (UID: "e91db560-3692-4697-bbf0-3c5a8438d5e5"). InnerVolumeSpecName "kube-api-access-bms72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.051676 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 03:41:27.545942959 +0000 UTC Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.051709 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6579h30m33.494236509s for next certificate rotation Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.108899 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bms72\" (UniqueName: \"kubernetes.io/projected/e91db560-3692-4697-bbf0-3c5a8438d5e5-kube-api-access-bms72\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.722628 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" event={"ID":"e91db560-3692-4697-bbf0-3c5a8438d5e5","Type":"ContainerDied","Data":"ff1968471b32a160d4d09b2d1828d15e7279f81c9a3b439e1dcbd29bdde35523"} Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 
00:10:54.722679 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff1968471b32a160d4d09b2d1828d15e7279f81c9a3b439e1dcbd29bdde35523" Mar 20 00:10:54 crc kubenswrapper[4867]: I0320 00:10:54.722713 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566088-pbdgz" Mar 20 00:10:55 crc kubenswrapper[4867]: I0320 00:10:55.730084 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566090-fczg6" event={"ID":"a4b1008c-a714-49a2-8c17-e971afc302af","Type":"ContainerStarted","Data":"ded3131ae9568b714d2611623f59bf8d400b6e7d047b4867c92aa7440f7e7425"} Mar 20 00:10:56 crc kubenswrapper[4867]: I0320 00:10:56.441711 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566090-fczg6" podStartSLOduration=6.215249608 podStartE2EDuration="56.441693264s" podCreationTimestamp="2026-03-20 00:10:00 +0000 UTC" firstStartedPulling="2026-03-20 00:10:05.099902544 +0000 UTC m=+219.326440061" lastFinishedPulling="2026-03-20 00:10:55.32634621 +0000 UTC m=+269.552883717" observedRunningTime="2026-03-20 00:10:55.746417112 +0000 UTC m=+269.972954629" watchObservedRunningTime="2026-03-20 00:10:56.441693264 +0000 UTC m=+270.668230781" Mar 20 00:10:56 crc kubenswrapper[4867]: I0320 00:10:56.739358 4867 generic.go:334] "Generic (PLEG): container finished" podID="a4b1008c-a714-49a2-8c17-e971afc302af" containerID="ded3131ae9568b714d2611623f59bf8d400b6e7d047b4867c92aa7440f7e7425" exitCode=0 Mar 20 00:10:56 crc kubenswrapper[4867]: I0320 00:10:56.739406 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566090-fczg6" event={"ID":"a4b1008c-a714-49a2-8c17-e971afc302af","Type":"ContainerDied","Data":"ded3131ae9568b714d2611623f59bf8d400b6e7d047b4867c92aa7440f7e7425"} Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.086868 4867 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.164209 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm7lb\" (UniqueName: \"kubernetes.io/projected/a4b1008c-a714-49a2-8c17-e971afc302af-kube-api-access-sm7lb\") pod \"a4b1008c-a714-49a2-8c17-e971afc302af\" (UID: \"a4b1008c-a714-49a2-8c17-e971afc302af\") " Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.171064 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b1008c-a714-49a2-8c17-e971afc302af-kube-api-access-sm7lb" (OuterVolumeSpecName: "kube-api-access-sm7lb") pod "a4b1008c-a714-49a2-8c17-e971afc302af" (UID: "a4b1008c-a714-49a2-8c17-e971afc302af"). InnerVolumeSpecName "kube-api-access-sm7lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.266016 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm7lb\" (UniqueName: \"kubernetes.io/projected/a4b1008c-a714-49a2-8c17-e971afc302af-kube-api-access-sm7lb\") on node \"crc\" DevicePath \"\"" Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.752031 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566090-fczg6" event={"ID":"a4b1008c-a714-49a2-8c17-e971afc302af","Type":"ContainerDied","Data":"d9874503b86a06a232a1ce0c7a309a3e4c773b14962b9d3286eb5221782a1e69"} Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.752417 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9874503b86a06a232a1ce0c7a309a3e4c773b14962b9d3286eb5221782a1e69" Mar 20 00:10:58 crc kubenswrapper[4867]: I0320 00:10:58.752104 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566090-fczg6" Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.770362 4867 generic.go:334] "Generic (PLEG): container finished" podID="61384875-b5b9-4757-839c-2071e973510c" containerID="995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d" exitCode=0 Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.770768 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsj5n" event={"ID":"61384875-b5b9-4757-839c-2071e973510c","Type":"ContainerDied","Data":"995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d"} Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.777139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerStarted","Data":"d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4"} Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.782840 4867 generic.go:334] "Generic (PLEG): container finished" podID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerID="3f045cf2db804dcb6448b014fe35b4de43689360eeef7fa729cb6c82e89a7442" exitCode=0 Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.782916 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-68h4b" event={"ID":"a24a1654-a84e-4b71-9209-dfd7e42941d0","Type":"ContainerDied","Data":"3f045cf2db804dcb6448b014fe35b4de43689360eeef7fa729cb6c82e89a7442"} Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.790937 4867 generic.go:334] "Generic (PLEG): container finished" podID="e04729fe-8c2e-4265-aa19-5717138937bd" containerID="5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622" exitCode=0 Mar 20 00:11:00 crc kubenswrapper[4867]: I0320 00:11:00.790974 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-992sr" 
event={"ID":"e04729fe-8c2e-4265-aa19-5717138937bd","Type":"ContainerDied","Data":"5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622"} Mar 20 00:11:01 crc kubenswrapper[4867]: E0320 00:11:01.139752 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad: Manifest does not match provided manifest digest sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad" Mar 20 00:11:01 crc kubenswrapper[4867]: E0320 00:11:01.139985 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad,Command:[/bin/opm],Args:[serve /extracted-catalog/catalog --cache-dir=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOMEMLIMIT,Value:120MiB,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{125829120 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5v5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fsj5n_openshift-marketplace(61384875-b5b9-4757-839c-2071e973510c): ErrImagePull: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad: Manifest does not match provided manifest digest sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad" logger="UnhandledError" Mar 20 00:11:01 crc kubenswrapper[4867]: E0320 00:11:01.142087 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"determining manifest MIME type for 
docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad: Manifest does not match provided manifest digest sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\"" pod="openshift-marketplace/community-operators-fsj5n" podUID="61384875-b5b9-4757-839c-2071e973510c" Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.295924 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bwpfb"] Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.797286 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-68h4b" event={"ID":"a24a1654-a84e-4b71-9209-dfd7e42941d0","Type":"ContainerStarted","Data":"3e6116bce91ea4d5ba1dbc4e56958654b2f2bb1427084d74f88ec782d9ca9d46"} Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.799139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-992sr" event={"ID":"e04729fe-8c2e-4265-aa19-5717138937bd","Type":"ContainerStarted","Data":"72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4"} Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.800816 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerID="d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4" exitCode=0 Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.800892 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerDied","Data":"d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4"} Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.802214 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" 
event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerStarted","Data":"7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba"} Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.842528 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-68h4b" podStartSLOduration=2.584799224 podStartE2EDuration="52.842511188s" podCreationTimestamp="2026-03-20 00:10:09 +0000 UTC" firstStartedPulling="2026-03-20 00:10:11.139676572 +0000 UTC m=+225.366214079" lastFinishedPulling="2026-03-20 00:11:01.397388526 +0000 UTC m=+275.623926043" observedRunningTime="2026-03-20 00:11:01.823848504 +0000 UTC m=+276.050386021" watchObservedRunningTime="2026-03-20 00:11:01.842511188 +0000 UTC m=+276.069048705" Mar 20 00:11:01 crc kubenswrapper[4867]: I0320 00:11:01.862454 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-992sr" podStartSLOduration=3.8012743650000003 podStartE2EDuration="49.862440034s" podCreationTimestamp="2026-03-20 00:10:12 +0000 UTC" firstStartedPulling="2026-03-20 00:10:15.309119575 +0000 UTC m=+229.535657092" lastFinishedPulling="2026-03-20 00:11:01.370285244 +0000 UTC m=+275.596822761" observedRunningTime="2026-03-20 00:11:01.8603553 +0000 UTC m=+276.086892817" watchObservedRunningTime="2026-03-20 00:11:01.862440034 +0000 UTC m=+276.088977551" Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.564768 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.599808 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.809258 4867 generic.go:334] "Generic (PLEG): container finished" podID="33637ed0-f47f-4095-b742-dd29244de21c" 
containerID="7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba" exitCode=0 Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.809307 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerDied","Data":"7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba"} Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.812631 4867 generic.go:334] "Generic (PLEG): container finished" podID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerID="d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301" exitCode=0 Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.812717 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxxsq" event={"ID":"38f94e88-d9e0-41a6-9dfe-390f2d709596","Type":"ContainerDied","Data":"d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301"} Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.825814 4867 generic.go:334] "Generic (PLEG): container finished" podID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerID="7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a" exitCode=0 Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.825963 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9sbs" event={"ID":"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337","Type":"ContainerDied","Data":"7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a"} Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.829638 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerStarted","Data":"caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6"} Mar 20 00:11:02 crc kubenswrapper[4867]: I0320 00:11:02.873936 4867 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kjkcm" podStartSLOduration=2.660810385 podStartE2EDuration="51.873919796s" podCreationTimestamp="2026-03-20 00:10:11 +0000 UTC" firstStartedPulling="2026-03-20 00:10:13.261748687 +0000 UTC m=+227.488286204" lastFinishedPulling="2026-03-20 00:11:02.474858098 +0000 UTC m=+276.701395615" observedRunningTime="2026-03-20 00:11:02.855817437 +0000 UTC m=+277.082354964" watchObservedRunningTime="2026-03-20 00:11:02.873919796 +0000 UTC m=+277.100457313" Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.056100 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.056137 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.835738 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9sbs" event={"ID":"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337","Type":"ContainerStarted","Data":"cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd"} Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.838292 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerStarted","Data":"a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d"} Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.840204 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxxsq" event={"ID":"38f94e88-d9e0-41a6-9dfe-390f2d709596","Type":"ContainerStarted","Data":"761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7"} Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.855318 4867 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-w9sbs" podStartSLOduration=2.591541898 podStartE2EDuration="55.85530085s" podCreationTimestamp="2026-03-20 00:10:08 +0000 UTC" firstStartedPulling="2026-03-20 00:10:10.099692456 +0000 UTC m=+224.326229973" lastFinishedPulling="2026-03-20 00:11:03.363451418 +0000 UTC m=+277.589988925" observedRunningTime="2026-03-20 00:11:03.853719259 +0000 UTC m=+278.080256776" watchObservedRunningTime="2026-03-20 00:11:03.85530085 +0000 UTC m=+278.081838367" Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.879725 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bxxsq" podStartSLOduration=2.724368873 podStartE2EDuration="54.879707302s" podCreationTimestamp="2026-03-20 00:10:09 +0000 UTC" firstStartedPulling="2026-03-20 00:10:11.149369357 +0000 UTC m=+225.375906874" lastFinishedPulling="2026-03-20 00:11:03.304707786 +0000 UTC m=+277.531245303" observedRunningTime="2026-03-20 00:11:03.876636533 +0000 UTC m=+278.103174050" watchObservedRunningTime="2026-03-20 00:11:03.879707302 +0000 UTC m=+278.106244819" Mar 20 00:11:03 crc kubenswrapper[4867]: I0320 00:11:03.891457 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pj4c8" podStartSLOduration=2.762999259 podStartE2EDuration="52.891446987s" podCreationTimestamp="2026-03-20 00:10:11 +0000 UTC" firstStartedPulling="2026-03-20 00:10:13.267255946 +0000 UTC m=+227.493793463" lastFinishedPulling="2026-03-20 00:11:03.395703674 +0000 UTC m=+277.622241191" observedRunningTime="2026-03-20 00:11:03.891218281 +0000 UTC m=+278.117755798" watchObservedRunningTime="2026-03-20 00:11:03.891446987 +0000 UTC m=+278.117984504" Mar 20 00:11:04 crc kubenswrapper[4867]: I0320 00:11:04.090836 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-992sr" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" 
containerName="registry-server" probeResult="failure" output=< Mar 20 00:11:04 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Mar 20 00:11:04 crc kubenswrapper[4867]: > Mar 20 00:11:05 crc kubenswrapper[4867]: I0320 00:11:05.902043 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc6595674-hm274"] Mar 20 00:11:07 crc kubenswrapper[4867]: I0320 00:11:05.902290 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" podUID="9f0c542d-82d1-4dd4-9607-05ab0325e90e" containerName="controller-manager" containerID="cri-o://6447f6d467b2431cdd529f6d0c4f65c688a1ab7af2bfb35cbb3af64c3ab2e454" gracePeriod=30 Mar 20 00:11:07 crc kubenswrapper[4867]: I0320 00:11:05.925448 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5"] Mar 20 00:11:07 crc kubenswrapper[4867]: I0320 00:11:05.926004 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" podUID="1b79faf1-1e7c-46c4-a0ba-f452920d748d" containerName="route-controller-manager" containerID="cri-o://5694c7ce8166c0091bf9ed6c7d510f664e73c1f2895655fb085ffc6e5d332e62" gracePeriod=30 Mar 20 00:11:07 crc kubenswrapper[4867]: I0320 00:11:07.861947 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f0c542d-82d1-4dd4-9607-05ab0325e90e" containerID="6447f6d467b2431cdd529f6d0c4f65c688a1ab7af2bfb35cbb3af64c3ab2e454" exitCode=0 Mar 20 00:11:07 crc kubenswrapper[4867]: I0320 00:11:07.862029 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" event={"ID":"9f0c542d-82d1-4dd4-9607-05ab0325e90e","Type":"ContainerDied","Data":"6447f6d467b2431cdd529f6d0c4f65c688a1ab7af2bfb35cbb3af64c3ab2e454"} Mar 20 
00:11:07 crc kubenswrapper[4867]: I0320 00:11:07.863357 4867 generic.go:334] "Generic (PLEG): container finished" podID="1b79faf1-1e7c-46c4-a0ba-f452920d748d" containerID="5694c7ce8166c0091bf9ed6c7d510f664e73c1f2895655fb085ffc6e5d332e62" exitCode=0 Mar 20 00:11:07 crc kubenswrapper[4867]: I0320 00:11:07.863389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" event={"ID":"1b79faf1-1e7c-46c4-a0ba-f452920d748d","Type":"ContainerDied","Data":"5694c7ce8166c0091bf9ed6c7d510f664e73c1f2895655fb085ffc6e5d332e62"} Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.004471 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.008274 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.031905 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"] Mar 20 00:11:09 crc kubenswrapper[4867]: E0320 00:11:09.032110 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0c542d-82d1-4dd4-9607-05ab0325e90e" containerName="controller-manager" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032123 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0c542d-82d1-4dd4-9607-05ab0325e90e" containerName="controller-manager" Mar 20 00:11:09 crc kubenswrapper[4867]: E0320 00:11:09.032131 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b1008c-a714-49a2-8c17-e971afc302af" containerName="oc" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032136 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b1008c-a714-49a2-8c17-e971afc302af" 
containerName="oc" Mar 20 00:11:09 crc kubenswrapper[4867]: E0320 00:11:09.032147 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91db560-3692-4697-bbf0-3c5a8438d5e5" containerName="oc" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032153 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91db560-3692-4697-bbf0-3c5a8438d5e5" containerName="oc" Mar 20 00:11:09 crc kubenswrapper[4867]: E0320 00:11:09.032164 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b79faf1-1e7c-46c4-a0ba-f452920d748d" containerName="route-controller-manager" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032170 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b79faf1-1e7c-46c4-a0ba-f452920d748d" containerName="route-controller-manager" Mar 20 00:11:09 crc kubenswrapper[4867]: E0320 00:11:09.032180 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943856ce-59c3-4e9d-8b34-9053717caf7f" containerName="pruner" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032185 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="943856ce-59c3-4e9d-8b34-9053717caf7f" containerName="pruner" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032272 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="943856ce-59c3-4e9d-8b34-9053717caf7f" containerName="pruner" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032284 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91db560-3692-4697-bbf0-3c5a8438d5e5" containerName="oc" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032292 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0c542d-82d1-4dd4-9607-05ab0325e90e" containerName="controller-manager" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032299 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b79faf1-1e7c-46c4-a0ba-f452920d748d" containerName="route-controller-manager" Mar 20 
00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032307 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b1008c-a714-49a2-8c17-e971afc302af" containerName="oc" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.032699 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.043052 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"] Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095637 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc52j\" (UniqueName: \"kubernetes.io/projected/1b79faf1-1e7c-46c4-a0ba-f452920d748d-kube-api-access-cc52j\") pod \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095678 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0c542d-82d1-4dd4-9607-05ab0325e90e-serving-cert\") pod \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095704 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-config\") pod \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095740 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-config\") pod \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " Mar 20 00:11:09 crc 
kubenswrapper[4867]: I0320 00:11:09.095791 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-client-ca\") pod \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095828 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b79faf1-1e7c-46c4-a0ba-f452920d748d-serving-cert\") pod \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\" (UID: \"1b79faf1-1e7c-46c4-a0ba-f452920d748d\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095855 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-proxy-ca-bundles\") pod \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095888 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-client-ca\") pod \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.095948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz6cj\" (UniqueName: \"kubernetes.io/projected/9f0c542d-82d1-4dd4-9607-05ab0325e90e-kube-api-access-jz6cj\") pod \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\" (UID: \"9f0c542d-82d1-4dd4-9607-05ab0325e90e\") " Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.096422 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"1b79faf1-1e7c-46c4-a0ba-f452920d748d" (UID: "1b79faf1-1e7c-46c4-a0ba-f452920d748d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.096511 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-config" (OuterVolumeSpecName: "config") pod "1b79faf1-1e7c-46c4-a0ba-f452920d748d" (UID: "1b79faf1-1e7c-46c4-a0ba-f452920d748d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.096723 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-config" (OuterVolumeSpecName: "config") pod "9f0c542d-82d1-4dd4-9607-05ab0325e90e" (UID: "9f0c542d-82d1-4dd4-9607-05ab0325e90e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.096807 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9f0c542d-82d1-4dd4-9607-05ab0325e90e" (UID: "9f0c542d-82d1-4dd4-9607-05ab0325e90e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.096989 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f0c542d-82d1-4dd4-9607-05ab0325e90e" (UID: "9f0c542d-82d1-4dd4-9607-05ab0325e90e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097467 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cd4758-795a-431c-ab92-b3156cfeb6e7-serving-cert\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097573 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4s4x\" (UniqueName: \"kubernetes.io/projected/12cd4758-795a-431c-ab92-b3156cfeb6e7-kube-api-access-z4s4x\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097604 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-proxy-ca-bundles\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097656 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-config\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097688 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-client-ca\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097919 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097941 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097954 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b79faf1-1e7c-46c4-a0ba-f452920d748d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097967 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.097981 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0c542d-82d1-4dd4-9607-05ab0325e90e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.100980 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0c542d-82d1-4dd4-9607-05ab0325e90e-kube-api-access-jz6cj" (OuterVolumeSpecName: "kube-api-access-jz6cj") pod "9f0c542d-82d1-4dd4-9607-05ab0325e90e" (UID: "9f0c542d-82d1-4dd4-9607-05ab0325e90e"). InnerVolumeSpecName "kube-api-access-jz6cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.101796 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0c542d-82d1-4dd4-9607-05ab0325e90e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f0c542d-82d1-4dd4-9607-05ab0325e90e" (UID: "9f0c542d-82d1-4dd4-9607-05ab0325e90e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.104562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b79faf1-1e7c-46c4-a0ba-f452920d748d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b79faf1-1e7c-46c4-a0ba-f452920d748d" (UID: "1b79faf1-1e7c-46c4-a0ba-f452920d748d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.109978 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b79faf1-1e7c-46c4-a0ba-f452920d748d-kube-api-access-cc52j" (OuterVolumeSpecName: "kube-api-access-cc52j") pod "1b79faf1-1e7c-46c4-a0ba-f452920d748d" (UID: "1b79faf1-1e7c-46c4-a0ba-f452920d748d"). InnerVolumeSpecName "kube-api-access-cc52j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.199725 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cd4758-795a-431c-ab92-b3156cfeb6e7-serving-cert\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.199856 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4s4x\" (UniqueName: \"kubernetes.io/projected/12cd4758-795a-431c-ab92-b3156cfeb6e7-kube-api-access-z4s4x\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.199897 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-proxy-ca-bundles\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.199935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-config\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.199955 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-client-ca\") pod 
\"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.201098 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc52j\" (UniqueName: \"kubernetes.io/projected/1b79faf1-1e7c-46c4-a0ba-f452920d748d-kube-api-access-cc52j\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.201374 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0c542d-82d1-4dd4-9607-05ab0325e90e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.201926 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b79faf1-1e7c-46c4-a0ba-f452920d748d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.201942 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz6cj\" (UniqueName: \"kubernetes.io/projected/9f0c542d-82d1-4dd4-9607-05ab0325e90e-kube-api-access-jz6cj\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.203274 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-proxy-ca-bundles\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.204787 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-config\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " 
pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.208653 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cd4758-795a-431c-ab92-b3156cfeb6e7-serving-cert\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.208823 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-client-ca\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.215744 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4s4x\" (UniqueName: \"kubernetes.io/projected/12cd4758-795a-431c-ab92-b3156cfeb6e7-kube-api-access-z4s4x\") pod \"controller-manager-c8fcf5668-8cxsr\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.334406 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.334480 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.355607 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.375371 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.735707 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.735757 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.751691 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"] Mar 20 00:11:09 crc kubenswrapper[4867]: W0320 00:11:09.758239 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12cd4758_795a_431c_ab92_b3156cfeb6e7.slice/crio-7458a5605845b8a95f6c6cf134a78f80051d036a4174dd5fd0b29d1ad54ca2d5 WatchSource:0}: Error finding container 7458a5605845b8a95f6c6cf134a78f80051d036a4174dd5fd0b29d1ad54ca2d5: Status 404 returned error can't find the container with id 7458a5605845b8a95f6c6cf134a78f80051d036a4174dd5fd0b29d1ad54ca2d5 Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.772281 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.886121 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" event={"ID":"12cd4758-795a-431c-ab92-b3156cfeb6e7","Type":"ContainerStarted","Data":"7458a5605845b8a95f6c6cf134a78f80051d036a4174dd5fd0b29d1ad54ca2d5"} Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.887951 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" event={"ID":"9f0c542d-82d1-4dd4-9607-05ab0325e90e","Type":"ContainerDied","Data":"5392f18b1be81b5e2664be06430494dfeae427d4358532b095fd1645d56cb15e"} Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.887970 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc6595674-hm274" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.887992 4867 scope.go:117] "RemoveContainer" containerID="6447f6d467b2431cdd529f6d0c4f65c688a1ab7af2bfb35cbb3af64c3ab2e454" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.891019 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.891889 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5" event={"ID":"1b79faf1-1e7c-46c4-a0ba-f452920d748d","Type":"ContainerDied","Data":"df08fd4cb52d33cdc4ff6a87d2dc340cbeaf832f3fb5c886c207cb8858583d5f"} Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.904350 4867 scope.go:117] "RemoveContainer" containerID="5694c7ce8166c0091bf9ed6c7d510f664e73c1f2895655fb085ffc6e5d332e62" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.917711 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc6595674-hm274"] Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.921576 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cc6595674-hm274"] Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.934221 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5"] Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.934608 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.934677 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.934818 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.937137 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.938185 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-549984b657-jnxb5"] Mar 20 00:11:09 crc kubenswrapper[4867]: I0320 00:11:09.980619 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:11:10 crc kubenswrapper[4867]: I0320 00:11:10.427781 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b79faf1-1e7c-46c4-a0ba-f452920d748d" path="/var/lib/kubelet/pods/1b79faf1-1e7c-46c4-a0ba-f452920d748d/volumes" Mar 20 00:11:10 crc kubenswrapper[4867]: I0320 00:11:10.428268 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0c542d-82d1-4dd4-9607-05ab0325e90e" path="/var/lib/kubelet/pods/9f0c542d-82d1-4dd4-9607-05ab0325e90e/volumes" Mar 20 00:11:10 crc kubenswrapper[4867]: I0320 00:11:10.897256 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" 
event={"ID":"12cd4758-795a-431c-ab92-b3156cfeb6e7","Type":"ContainerStarted","Data":"f65dc7b68a5c7a7edd2dd8789d7981492b8766bf41d00ae3a64705f3b5fe4436"} Mar 20 00:11:10 crc kubenswrapper[4867]: I0320 00:11:10.937095 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.527562 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.527981 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.554362 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"] Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.554984 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.557310 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.557589 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.558036 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.558188 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.558403 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.560966 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.570430 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"] Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.610667 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.636026 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-config\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") 
" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.636108 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-client-ca\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.636190 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b06883-a7a1-473b-bb3b-e8651ce58fae-serving-cert\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.636227 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8l88\" (UniqueName: \"kubernetes.io/projected/21b06883-a7a1-473b-bb3b-e8651ce58fae-kube-api-access-f8l88\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.658827 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxxsq"] Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.737683 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b06883-a7a1-473b-bb3b-e8651ce58fae-serving-cert\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " 
pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.737736 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8l88\" (UniqueName: \"kubernetes.io/projected/21b06883-a7a1-473b-bb3b-e8651ce58fae-kube-api-access-f8l88\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.737794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-config\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.737827 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-client-ca\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.738699 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-client-ca\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.739074 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-config\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.745686 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b06883-a7a1-473b-bb3b-e8651ce58fae-serving-cert\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.760094 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8l88\" (UniqueName: \"kubernetes.io/projected/21b06883-a7a1-473b-bb3b-e8651ce58fae-kube-api-access-f8l88\") pod \"route-controller-manager-64f4c9dfb4-jnhm8\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.852119 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-68h4b"] Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.891275 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.891471 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.914530 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bxxsq" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="registry-server" 
containerID="cri-o://761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7" gracePeriod=2 Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.917511 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.946162 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.948392 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" podStartSLOduration=6.948369628 podStartE2EDuration="6.948369628s" podCreationTimestamp="2026-03-20 00:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:11:11.937277391 +0000 UTC m=+286.163814918" watchObservedRunningTime="2026-03-20 00:11:11.948369628 +0000 UTC m=+286.174907145" Mar 20 00:11:11 crc kubenswrapper[4867]: I0320 00:11:11.969373 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.362457 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"] Mar 20 00:11:12 crc kubenswrapper[4867]: W0320 00:11:12.372784 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b06883_a7a1_473b_bb3b_e8651ce58fae.slice/crio-15037152aaa585d21793dde83eed89f5a170e4917ccccde1faf853c8deb44998 WatchSource:0}: Error finding container 15037152aaa585d21793dde83eed89f5a170e4917ccccde1faf853c8deb44998: Status 404 returned error can't find the container with id 
15037152aaa585d21793dde83eed89f5a170e4917ccccde1faf853c8deb44998 Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.815753 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.920918 4867 generic.go:334] "Generic (PLEG): container finished" podID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerID="761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7" exitCode=0 Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.920982 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxxsq" event={"ID":"38f94e88-d9e0-41a6-9dfe-390f2d709596","Type":"ContainerDied","Data":"761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7"} Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.921008 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxxsq" event={"ID":"38f94e88-d9e0-41a6-9dfe-390f2d709596","Type":"ContainerDied","Data":"2f7805c4389bf9d61f62e9a561593a7c00385379ffcb7de6b7000182625df4cc"} Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.921024 4867 scope.go:117] "RemoveContainer" containerID="761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.921038 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxxsq" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.923572 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" event={"ID":"21b06883-a7a1-473b-bb3b-e8651ce58fae","Type":"ContainerStarted","Data":"f39eef7722624e1cba330d288d1550e002097dba1c97096547821b468472fa08"} Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.923625 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" event={"ID":"21b06883-a7a1-473b-bb3b-e8651ce58fae","Type":"ContainerStarted","Data":"15037152aaa585d21793dde83eed89f5a170e4917ccccde1faf853c8deb44998"} Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.923712 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-68h4b" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="registry-server" containerID="cri-o://3e6116bce91ea4d5ba1dbc4e56958654b2f2bb1427084d74f88ec782d9ca9d46" gracePeriod=2 Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.943675 4867 scope.go:117] "RemoveContainer" containerID="d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.954096 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" podStartSLOduration=7.954073602 podStartE2EDuration="7.954073602s" podCreationTimestamp="2026-03-20 00:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:11:12.946072265 +0000 UTC m=+287.172609782" watchObservedRunningTime="2026-03-20 00:11:12.954073602 +0000 UTC m=+287.180611119" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 
00:11:12.965738 4867 scope.go:117] "RemoveContainer" containerID="3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.971505 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-catalog-content\") pod \"38f94e88-d9e0-41a6-9dfe-390f2d709596\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.971604 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhjrb\" (UniqueName: \"kubernetes.io/projected/38f94e88-d9e0-41a6-9dfe-390f2d709596-kube-api-access-lhjrb\") pod \"38f94e88-d9e0-41a6-9dfe-390f2d709596\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.971753 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-utilities\") pod \"38f94e88-d9e0-41a6-9dfe-390f2d709596\" (UID: \"38f94e88-d9e0-41a6-9dfe-390f2d709596\") " Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.973253 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-utilities" (OuterVolumeSpecName: "utilities") pod "38f94e88-d9e0-41a6-9dfe-390f2d709596" (UID: "38f94e88-d9e0-41a6-9dfe-390f2d709596"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.982473 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f94e88-d9e0-41a6-9dfe-390f2d709596-kube-api-access-lhjrb" (OuterVolumeSpecName: "kube-api-access-lhjrb") pod "38f94e88-d9e0-41a6-9dfe-390f2d709596" (UID: "38f94e88-d9e0-41a6-9dfe-390f2d709596"). 
InnerVolumeSpecName "kube-api-access-lhjrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.991769 4867 scope.go:117] "RemoveContainer" containerID="761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7" Mar 20 00:11:12 crc kubenswrapper[4867]: E0320 00:11:12.992391 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7\": container with ID starting with 761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7 not found: ID does not exist" containerID="761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.992475 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7"} err="failed to get container status \"761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7\": rpc error: code = NotFound desc = could not find container \"761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7\": container with ID starting with 761d30edb4777d50a5308c20dbb4c9d8081a5d5e8ac084dc32425688ef885da7 not found: ID does not exist" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.992538 4867 scope.go:117] "RemoveContainer" containerID="d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.992896 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:11:12 crc kubenswrapper[4867]: E0320 00:11:12.993264 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301\": container with ID starting with 
d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301 not found: ID does not exist" containerID="d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.993293 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301"} err="failed to get container status \"d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301\": rpc error: code = NotFound desc = could not find container \"d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301\": container with ID starting with d6fa3dcc5f70892a3a02a892b0b17e5d749089426b3e8b914f2345ab6a3cf301 not found: ID does not exist" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.993339 4867 scope.go:117] "RemoveContainer" containerID="3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3" Mar 20 00:11:12 crc kubenswrapper[4867]: E0320 00:11:12.993557 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3\": container with ID starting with 3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3 not found: ID does not exist" containerID="3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3" Mar 20 00:11:12 crc kubenswrapper[4867]: I0320 00:11:12.993609 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3"} err="failed to get container status \"3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3\": rpc error: code = NotFound desc = could not find container \"3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3\": container with ID starting with 3a97312c1170cf6b3cf0a4c52a1a40216422a6f4151834ce1fa7ae62c2977fb3 not found: ID does not 
exist" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.059148 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38f94e88-d9e0-41a6-9dfe-390f2d709596" (UID: "38f94e88-d9e0-41a6-9dfe-390f2d709596"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.073935 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.073994 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f94e88-d9e0-41a6-9dfe-390f2d709596-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.074014 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhjrb\" (UniqueName: \"kubernetes.io/projected/38f94e88-d9e0-41a6-9dfe-390f2d709596-kube-api-access-lhjrb\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.098850 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.149814 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.256577 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxxsq"] Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.259151 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bxxsq"] Mar 20 00:11:13 
crc kubenswrapper[4867]: I0320 00:11:13.932273 4867 generic.go:334] "Generic (PLEG): container finished" podID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerID="3e6116bce91ea4d5ba1dbc4e56958654b2f2bb1427084d74f88ec782d9ca9d46" exitCode=0 Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.932358 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-68h4b" event={"ID":"a24a1654-a84e-4b71-9209-dfd7e42941d0","Type":"ContainerDied","Data":"3e6116bce91ea4d5ba1dbc4e56958654b2f2bb1427084d74f88ec782d9ca9d46"} Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.933840 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:13 crc kubenswrapper[4867]: I0320 00:11:13.941464 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.055184 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj4c8"] Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.188363 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.293141 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-utilities\") pod \"a24a1654-a84e-4b71-9209-dfd7e42941d0\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.293298 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-455wk\" (UniqueName: \"kubernetes.io/projected/a24a1654-a84e-4b71-9209-dfd7e42941d0-kube-api-access-455wk\") pod \"a24a1654-a84e-4b71-9209-dfd7e42941d0\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.293359 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-catalog-content\") pod \"a24a1654-a84e-4b71-9209-dfd7e42941d0\" (UID: \"a24a1654-a84e-4b71-9209-dfd7e42941d0\") " Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.293966 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-utilities" (OuterVolumeSpecName: "utilities") pod "a24a1654-a84e-4b71-9209-dfd7e42941d0" (UID: "a24a1654-a84e-4b71-9209-dfd7e42941d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.297947 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24a1654-a84e-4b71-9209-dfd7e42941d0-kube-api-access-455wk" (OuterVolumeSpecName: "kube-api-access-455wk") pod "a24a1654-a84e-4b71-9209-dfd7e42941d0" (UID: "a24a1654-a84e-4b71-9209-dfd7e42941d0"). InnerVolumeSpecName "kube-api-access-455wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.365178 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a24a1654-a84e-4b71-9209-dfd7e42941d0" (UID: "a24a1654-a84e-4b71-9209-dfd7e42941d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.396197 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.396272 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-455wk\" (UniqueName: \"kubernetes.io/projected/a24a1654-a84e-4b71-9209-dfd7e42941d0-kube-api-access-455wk\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.396299 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24a1654-a84e-4b71-9209-dfd7e42941d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.446962 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" path="/var/lib/kubelet/pods/38f94e88-d9e0-41a6-9dfe-390f2d709596/volumes" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.942866 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-68h4b" event={"ID":"a24a1654-a84e-4b71-9209-dfd7e42941d0","Type":"ContainerDied","Data":"665fc31fadc8730f602d306c8115db9c1fddcfb53572464e756353342f85e16b"} Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.942978 4867 scope.go:117] "RemoveContainer" 
containerID="3e6116bce91ea4d5ba1dbc4e56958654b2f2bb1427084d74f88ec782d9ca9d46" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.943082 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pj4c8" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="registry-server" containerID="cri-o://a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d" gracePeriod=2 Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.943150 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-68h4b" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.964067 4867 scope.go:117] "RemoveContainer" containerID="3f045cf2db804dcb6448b014fe35b4de43689360eeef7fa729cb6c82e89a7442" Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.973238 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-68h4b"] Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.981024 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-68h4b"] Mar 20 00:11:14 crc kubenswrapper[4867]: I0320 00:11:14.985517 4867 scope.go:117] "RemoveContainer" containerID="56b0b16fdf0275108c3ef7bbe09f6499e1316c587ba3009bdb80979e63065f37" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.367725 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.514450 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2qm5\" (UniqueName: \"kubernetes.io/projected/33637ed0-f47f-4095-b742-dd29244de21c-kube-api-access-f2qm5\") pod \"33637ed0-f47f-4095-b742-dd29244de21c\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.514524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-utilities\") pod \"33637ed0-f47f-4095-b742-dd29244de21c\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.514601 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-catalog-content\") pod \"33637ed0-f47f-4095-b742-dd29244de21c\" (UID: \"33637ed0-f47f-4095-b742-dd29244de21c\") " Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.515386 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-utilities" (OuterVolumeSpecName: "utilities") pod "33637ed0-f47f-4095-b742-dd29244de21c" (UID: "33637ed0-f47f-4095-b742-dd29244de21c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.518991 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33637ed0-f47f-4095-b742-dd29244de21c-kube-api-access-f2qm5" (OuterVolumeSpecName: "kube-api-access-f2qm5") pod "33637ed0-f47f-4095-b742-dd29244de21c" (UID: "33637ed0-f47f-4095-b742-dd29244de21c"). InnerVolumeSpecName "kube-api-access-f2qm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.537967 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33637ed0-f47f-4095-b742-dd29244de21c" (UID: "33637ed0-f47f-4095-b742-dd29244de21c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.616441 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2qm5\" (UniqueName: \"kubernetes.io/projected/33637ed0-f47f-4095-b742-dd29244de21c-kube-api-access-f2qm5\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.616475 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.616485 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33637ed0-f47f-4095-b742-dd29244de21c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.951389 4867 generic.go:334] "Generic (PLEG): container finished" podID="33637ed0-f47f-4095-b742-dd29244de21c" containerID="a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d" exitCode=0 Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.951446 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj4c8" Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.951453 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerDied","Data":"a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d"} Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.951479 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj4c8" event={"ID":"33637ed0-f47f-4095-b742-dd29244de21c","Type":"ContainerDied","Data":"9737c22e87bceaaf7bc4be2d4ecc0f4d49fbde1a735c07724512c79d973815da"} Mar 20 00:11:15 crc kubenswrapper[4867]: I0320 00:11:15.951518 4867 scope.go:117] "RemoveContainer" containerID="a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.005149 4867 scope.go:117] "RemoveContainer" containerID="7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.026856 4867 scope.go:117] "RemoveContainer" containerID="28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.033802 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj4c8"] Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.038360 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj4c8"] Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.049314 4867 scope.go:117] "RemoveContainer" containerID="a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d" Mar 20 00:11:16 crc kubenswrapper[4867]: E0320 00:11:16.049771 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d\": container with ID starting with a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d not found: ID does not exist" containerID="a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.049799 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d"} err="failed to get container status \"a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d\": rpc error: code = NotFound desc = could not find container \"a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d\": container with ID starting with a912da23bceb8a038f0cbb81f14b209028cd57a88e082041f54aa18c64db739d not found: ID does not exist" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.049818 4867 scope.go:117] "RemoveContainer" containerID="7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba" Mar 20 00:11:16 crc kubenswrapper[4867]: E0320 00:11:16.050046 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba\": container with ID starting with 7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba not found: ID does not exist" containerID="7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.050075 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba"} err="failed to get container status \"7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba\": rpc error: code = NotFound desc = could not find container \"7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba\": container with ID 
starting with 7b5d7a679ce794783a7e6523e2c84166dc70131ebf56172c5beef7242b83d5ba not found: ID does not exist" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.050094 4867 scope.go:117] "RemoveContainer" containerID="28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814" Mar 20 00:11:16 crc kubenswrapper[4867]: E0320 00:11:16.050323 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814\": container with ID starting with 28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814 not found: ID does not exist" containerID="28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.050350 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814"} err="failed to get container status \"28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814\": rpc error: code = NotFound desc = could not find container \"28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814\": container with ID starting with 28b03311f900c82ca52a4216d30f3e3780e3afa70ae27d4ce2b38cdb3e8b7814 not found: ID does not exist" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.430379 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33637ed0-f47f-4095-b742-dd29244de21c" path="/var/lib/kubelet/pods/33637ed0-f47f-4095-b742-dd29244de21c/volumes" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.431665 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" path="/var/lib/kubelet/pods/a24a1654-a84e-4b71-9209-dfd7e42941d0/volumes" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.453378 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-992sr"] Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.453597 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-992sr" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="registry-server" containerID="cri-o://72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4" gracePeriod=2 Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.919888 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.973959 4867 generic.go:334] "Generic (PLEG): container finished" podID="e04729fe-8c2e-4265-aa19-5717138937bd" containerID="72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4" exitCode=0 Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.974014 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-992sr" event={"ID":"e04729fe-8c2e-4265-aa19-5717138937bd","Type":"ContainerDied","Data":"72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4"} Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.974039 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-992sr" event={"ID":"e04729fe-8c2e-4265-aa19-5717138937bd","Type":"ContainerDied","Data":"bdbaa04290888a18d861cc38bd2fcb59fdb2b0eaf117dedfdf44a6c739befed9"} Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.974057 4867 scope.go:117] "RemoveContainer" containerID="72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.974050 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-992sr" Mar 20 00:11:16 crc kubenswrapper[4867]: I0320 00:11:16.976930 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsj5n" event={"ID":"61384875-b5b9-4757-839c-2071e973510c","Type":"ContainerStarted","Data":"8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1"} Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.000636 4867 scope.go:117] "RemoveContainer" containerID="5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.012752 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fsj5n" podStartSLOduration=3.223208666 podStartE2EDuration="1m8.012735654s" podCreationTimestamp="2026-03-20 00:10:09 +0000 UTC" firstStartedPulling="2026-03-20 00:10:11.117128493 +0000 UTC m=+225.343666000" lastFinishedPulling="2026-03-20 00:11:15.906655461 +0000 UTC m=+290.133192988" observedRunningTime="2026-03-20 00:11:17.008033693 +0000 UTC m=+291.234571210" watchObservedRunningTime="2026-03-20 00:11:17.012735654 +0000 UTC m=+291.239273171" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.023422 4867 scope.go:117] "RemoveContainer" containerID="56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.038447 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-utilities\") pod \"e04729fe-8c2e-4265-aa19-5717138937bd\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.038588 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9hf\" (UniqueName: 
\"kubernetes.io/projected/e04729fe-8c2e-4265-aa19-5717138937bd-kube-api-access-6r9hf\") pod \"e04729fe-8c2e-4265-aa19-5717138937bd\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.038659 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-catalog-content\") pod \"e04729fe-8c2e-4265-aa19-5717138937bd\" (UID: \"e04729fe-8c2e-4265-aa19-5717138937bd\") " Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.039437 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-utilities" (OuterVolumeSpecName: "utilities") pod "e04729fe-8c2e-4265-aa19-5717138937bd" (UID: "e04729fe-8c2e-4265-aa19-5717138937bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.039603 4867 scope.go:117] "RemoveContainer" containerID="72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4" Mar 20 00:11:17 crc kubenswrapper[4867]: E0320 00:11:17.040001 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4\": container with ID starting with 72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4 not found: ID does not exist" containerID="72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.040052 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4"} err="failed to get container status \"72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4\": rpc error: code = NotFound desc = could not find 
container \"72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4\": container with ID starting with 72a72e438bc28d5166d97520e97f147ed03fdf413f3d0e7bc88e780e18e186b4 not found: ID does not exist" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.040085 4867 scope.go:117] "RemoveContainer" containerID="5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622" Mar 20 00:11:17 crc kubenswrapper[4867]: E0320 00:11:17.040346 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622\": container with ID starting with 5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622 not found: ID does not exist" containerID="5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.040375 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622"} err="failed to get container status \"5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622\": rpc error: code = NotFound desc = could not find container \"5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622\": container with ID starting with 5c53ffae0764e9734e2d38babe9264fd112089898da8cf78c3a6cd25cbdc9622 not found: ID does not exist" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.040395 4867 scope.go:117] "RemoveContainer" containerID="56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249" Mar 20 00:11:17 crc kubenswrapper[4867]: E0320 00:11:17.040672 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249\": container with ID starting with 56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249 not found: ID does 
not exist" containerID="56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.040704 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249"} err="failed to get container status \"56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249\": rpc error: code = NotFound desc = could not find container \"56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249\": container with ID starting with 56eb3b7da1c9ec6e698bf0ccfbee9f60a2fca29c46c68d5aea03de29e5877249 not found: ID does not exist" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.046641 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04729fe-8c2e-4265-aa19-5717138937bd-kube-api-access-6r9hf" (OuterVolumeSpecName: "kube-api-access-6r9hf") pod "e04729fe-8c2e-4265-aa19-5717138937bd" (UID: "e04729fe-8c2e-4265-aa19-5717138937bd"). InnerVolumeSpecName "kube-api-access-6r9hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.140527 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r9hf\" (UniqueName: \"kubernetes.io/projected/e04729fe-8c2e-4265-aa19-5717138937bd-kube-api-access-6r9hf\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.140555 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.195447 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e04729fe-8c2e-4265-aa19-5717138937bd" (UID: "e04729fe-8c2e-4265-aa19-5717138937bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.242063 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04729fe-8c2e-4265-aa19-5717138937bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.302538 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-992sr"] Mar 20 00:11:17 crc kubenswrapper[4867]: I0320 00:11:17.302585 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-992sr"] Mar 20 00:11:18 crc kubenswrapper[4867]: I0320 00:11:18.430461 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" path="/var/lib/kubelet/pods/e04729fe-8c2e-4265-aa19-5717138937bd/volumes" Mar 20 00:11:18 crc kubenswrapper[4867]: I0320 00:11:18.860668 4867 patch_prober.go:28] 
interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:11:18 crc kubenswrapper[4867]: I0320 00:11:18.861024 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:11:18 crc kubenswrapper[4867]: I0320 00:11:18.861094 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:11:18 crc kubenswrapper[4867]: I0320 00:11:18.861897 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 00:11:18 crc kubenswrapper[4867]: I0320 00:11:18.861997 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee" gracePeriod=600 Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.356097 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.363429 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.538390 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.538656 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.573754 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.998240 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee" exitCode=0 Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.998371 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee"} Mar 20 00:11:19 crc kubenswrapper[4867]: I0320 00:11:19.998431 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"e816a18b900f4eb68c97f4f5041f2cb93edb33be685b6595c99943bb25a6e738"} Mar 20 00:11:21 crc kubenswrapper[4867]: I0320 00:11:21.041866 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:11:25 crc kubenswrapper[4867]: I0320 00:11:25.943413 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"] Mar 20 00:11:25 
crc kubenswrapper[4867]: I0320 00:11:25.944025 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" podUID="12cd4758-795a-431c-ab92-b3156cfeb6e7" containerName="controller-manager" containerID="cri-o://f65dc7b68a5c7a7edd2dd8789d7981492b8766bf41d00ae3a64705f3b5fe4436" gracePeriod=30 Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.036773 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"] Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.037244 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" podUID="21b06883-a7a1-473b-bb3b-e8651ce58fae" containerName="route-controller-manager" containerID="cri-o://f39eef7722624e1cba330d288d1550e002097dba1c97096547821b468472fa08" gracePeriod=30 Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.329545 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" podUID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" containerName="oauth-openshift" containerID="cri-o://647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191" gracePeriod=15 Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.373636 4867 generic.go:334] "Generic (PLEG): container finished" podID="21b06883-a7a1-473b-bb3b-e8651ce58fae" containerID="f39eef7722624e1cba330d288d1550e002097dba1c97096547821b468472fa08" exitCode=0 Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.373703 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" event={"ID":"21b06883-a7a1-473b-bb3b-e8651ce58fae","Type":"ContainerDied","Data":"f39eef7722624e1cba330d288d1550e002097dba1c97096547821b468472fa08"} Mar 20 00:11:26 crc 
kubenswrapper[4867]: I0320 00:11:26.374566 4867 generic.go:334] "Generic (PLEG): container finished" podID="12cd4758-795a-431c-ab92-b3156cfeb6e7" containerID="f65dc7b68a5c7a7edd2dd8789d7981492b8766bf41d00ae3a64705f3b5fe4436" exitCode=0 Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.374592 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" event={"ID":"12cd4758-795a-431c-ab92-b3156cfeb6e7","Type":"ContainerDied","Data":"f65dc7b68a5c7a7edd2dd8789d7981492b8766bf41d00ae3a64705f3b5fe4436"} Mar 20 00:11:26 crc kubenswrapper[4867]: E0320 00:11:26.423612 4867 info.go:109] Failed to get network devices: open /sys/class/net/15037152aaa585d/address: no such file or directory Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.562379 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.585109 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-config\") pod \"21b06883-a7a1-473b-bb3b-e8651ce58fae\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.585168 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-client-ca\") pod \"21b06883-a7a1-473b-bb3b-e8651ce58fae\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.585275 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b06883-a7a1-473b-bb3b-e8651ce58fae-serving-cert\") pod \"21b06883-a7a1-473b-bb3b-e8651ce58fae\" (UID: 
\"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.585313 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8l88\" (UniqueName: \"kubernetes.io/projected/21b06883-a7a1-473b-bb3b-e8651ce58fae-kube-api-access-f8l88\") pod \"21b06883-a7a1-473b-bb3b-e8651ce58fae\" (UID: \"21b06883-a7a1-473b-bb3b-e8651ce58fae\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.586480 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-client-ca" (OuterVolumeSpecName: "client-ca") pod "21b06883-a7a1-473b-bb3b-e8651ce58fae" (UID: "21b06883-a7a1-473b-bb3b-e8651ce58fae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.586920 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-config" (OuterVolumeSpecName: "config") pod "21b06883-a7a1-473b-bb3b-e8651ce58fae" (UID: "21b06883-a7a1-473b-bb3b-e8651ce58fae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.591239 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b06883-a7a1-473b-bb3b-e8651ce58fae-kube-api-access-f8l88" (OuterVolumeSpecName: "kube-api-access-f8l88") pod "21b06883-a7a1-473b-bb3b-e8651ce58fae" (UID: "21b06883-a7a1-473b-bb3b-e8651ce58fae"). InnerVolumeSpecName "kube-api-access-f8l88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.592081 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b06883-a7a1-473b-bb3b-e8651ce58fae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "21b06883-a7a1-473b-bb3b-e8651ce58fae" (UID: "21b06883-a7a1-473b-bb3b-e8651ce58fae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.620257 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.686847 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cd4758-795a-431c-ab92-b3156cfeb6e7-serving-cert\") pod \"12cd4758-795a-431c-ab92-b3156cfeb6e7\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.686913 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-proxy-ca-bundles\") pod \"12cd4758-795a-431c-ab92-b3156cfeb6e7\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.686959 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-client-ca\") pod \"12cd4758-795a-431c-ab92-b3156cfeb6e7\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") " Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.686993 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4s4x\" (UniqueName: 
\"kubernetes.io/projected/12cd4758-795a-431c-ab92-b3156cfeb6e7-kube-api-access-z4s4x\") pod \"12cd4758-795a-431c-ab92-b3156cfeb6e7\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687048 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-config\") pod \"12cd4758-795a-431c-ab92-b3156cfeb6e7\" (UID: \"12cd4758-795a-431c-ab92-b3156cfeb6e7\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687331 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8l88\" (UniqueName: \"kubernetes.io/projected/21b06883-a7a1-473b-bb3b-e8651ce58fae-kube-api-access-f8l88\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687354 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-config\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687383 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21b06883-a7a1-473b-bb3b-e8651ce58fae-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687402 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b06883-a7a1-473b-bb3b-e8651ce58fae-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687722 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-client-ca" (OuterVolumeSpecName: "client-ca") pod "12cd4758-795a-431c-ab92-b3156cfeb6e7" (UID: "12cd4758-795a-431c-ab92-b3156cfeb6e7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.687744 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "12cd4758-795a-431c-ab92-b3156cfeb6e7" (UID: "12cd4758-795a-431c-ab92-b3156cfeb6e7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.695911 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-config" (OuterVolumeSpecName: "config") pod "12cd4758-795a-431c-ab92-b3156cfeb6e7" (UID: "12cd4758-795a-431c-ab92-b3156cfeb6e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.700039 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cd4758-795a-431c-ab92-b3156cfeb6e7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12cd4758-795a-431c-ab92-b3156cfeb6e7" (UID: "12cd4758-795a-431c-ab92-b3156cfeb6e7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.706688 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cd4758-795a-431c-ab92-b3156cfeb6e7-kube-api-access-z4s4x" (OuterVolumeSpecName: "kube-api-access-z4s4x") pod "12cd4758-795a-431c-ab92-b3156cfeb6e7" (UID: "12cd4758-795a-431c-ab92-b3156cfeb6e7"). InnerVolumeSpecName "kube-api-access-z4s4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.782426 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.788363 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cd4758-795a-431c-ab92-b3156cfeb6e7-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.788395 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.788409 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.788421 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4s4x\" (UniqueName: \"kubernetes.io/projected/12cd4758-795a-431c-ab92-b3156cfeb6e7-kube-api-access-z4s4x\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.788434 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cd4758-795a-431c-ab92-b3156cfeb6e7-config\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889479 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-idp-0-file-data\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889532 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-dir\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889557 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-serving-cert\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889586 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-policies\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889614 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dcsh\" (UniqueName: \"kubernetes.io/projected/fb1e4d21-3692-4e9d-914c-12cd30f641fa-kube-api-access-2dcsh\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889633 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-login\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889652 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-trusted-ca-bundle\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889672 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-session\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889692 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-provider-selection\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889725 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-error\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889751 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-ocp-branding-template\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889768 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-service-ca\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889784 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-router-certs\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.889800 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-cliconfig\") pod \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\" (UID: \"fb1e4d21-3692-4e9d-914c-12cd30f641fa\") "
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.890272 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.890379 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.890397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.891407 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.892883 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.893664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.893680 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.893617 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.893829 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.894020 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.894344 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.894542 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1e4d21-3692-4e9d-914c-12cd30f641fa-kube-api-access-2dcsh" (OuterVolumeSpecName: "kube-api-access-2dcsh") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "kube-api-access-2dcsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.895873 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.897147 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fb1e4d21-3692-4e9d-914c-12cd30f641fa" (UID: "fb1e4d21-3692-4e9d-914c-12cd30f641fa"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991324 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991365 4867 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991378 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991387 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991397 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dcsh\" (UniqueName: \"kubernetes.io/projected/fb1e4d21-3692-4e9d-914c-12cd30f641fa-kube-api-access-2dcsh\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991405 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991415 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991423 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991432 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991444 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991453 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991461 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991469 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:26 crc kubenswrapper[4867]: I0320 00:11:26.991477 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb1e4d21-3692-4e9d-914c-12cd30f641fa-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.381535 4867 generic.go:334] "Generic (PLEG): container finished" podID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" containerID="647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191" exitCode=0
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.381572 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.381626 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" event={"ID":"fb1e4d21-3692-4e9d-914c-12cd30f641fa","Type":"ContainerDied","Data":"647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191"}
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.381667 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bwpfb" event={"ID":"fb1e4d21-3692-4e9d-914c-12cd30f641fa","Type":"ContainerDied","Data":"01efea2c9a61968229cf21c8c819f3abb71fe1ea8addd4b2a0ac127e375cabb6"}
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.381689 4867 scope.go:117] "RemoveContainer" containerID="647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.383258 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8" event={"ID":"21b06883-a7a1-473b-bb3b-e8651ce58fae","Type":"ContainerDied","Data":"15037152aaa585d21793dde83eed89f5a170e4917ccccde1faf853c8deb44998"}
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.383279 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.384657 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr" event={"ID":"12cd4758-795a-431c-ab92-b3156cfeb6e7","Type":"ContainerDied","Data":"7458a5605845b8a95f6c6cf134a78f80051d036a4174dd5fd0b29d1ad54ca2d5"}
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.384697 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.404189 4867 scope.go:117] "RemoveContainer" containerID="647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.406272 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191\": container with ID starting with 647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191 not found: ID does not exist" containerID="647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.406312 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191"} err="failed to get container status \"647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191\": rpc error: code = NotFound desc = could not find container \"647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191\": container with ID starting with 647c5eeecb834a87f127ff427d67b81a9a3fd038c51fa522e38dae78f0e95191 not found: ID does not exist"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.406338 4867 scope.go:117] "RemoveContainer" containerID="f39eef7722624e1cba330d288d1550e002097dba1c97096547821b468472fa08"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.413193 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.416764 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64f4c9dfb4-jnhm8"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.419382 4867 scope.go:117] "RemoveContainer" containerID="f65dc7b68a5c7a7edd2dd8789d7981492b8766bf41d00ae3a64705f3b5fe4436"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.430638 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.437515 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c8fcf5668-8cxsr"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.447690 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bwpfb"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.451713 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bwpfb"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573230 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"]
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573462 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573514 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573526 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573536 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573552 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573562 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573576 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573586 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573597 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573605 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573614 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b06883-a7a1-473b-bb3b-e8651ce58fae" containerName="route-controller-manager"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573622 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b06883-a7a1-473b-bb3b-e8651ce58fae" containerName="route-controller-manager"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573633 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573640 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573651 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" containerName="oauth-openshift"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573658 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" containerName="oauth-openshift"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573669 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573676 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="extract-utilities"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573684 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573691 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573704 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573713 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573722 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cd4758-795a-431c-ab92-b3156cfeb6e7" containerName="controller-manager"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573729 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cd4758-795a-431c-ab92-b3156cfeb6e7" containerName="controller-manager"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573739 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573747 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573757 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573764 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="extract-content"
Mar 20 00:11:27 crc kubenswrapper[4867]: E0320 00:11:27.573774 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573781 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573887 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b06883-a7a1-473b-bb3b-e8651ce58fae" containerName="route-controller-manager"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573898 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="33637ed0-f47f-4095-b742-dd29244de21c" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573910 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cd4758-795a-431c-ab92-b3156cfeb6e7" containerName="controller-manager"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573924 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f94e88-d9e0-41a6-9dfe-390f2d709596" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573932 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" containerName="oauth-openshift"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573942 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04729fe-8c2e-4265-aa19-5717138937bd" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.573951 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24a1654-a84e-4b71-9209-dfd7e42941d0" containerName="registry-server"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.574452 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.582866 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.583050 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.583086 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.583160 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.583556 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.586870 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.588303 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.588909 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.591773 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.591884 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.591948 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.592099 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.592123 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.592233 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.592345 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.597735 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296034a6-495a-44e7-bbec-41563a313938-serving-cert\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.597789 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-proxy-ca-bundles\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.597809 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlpws\" (UniqueName: \"kubernetes.io/projected/296034a6-495a-44e7-bbec-41563a313938-kube-api-access-jlpws\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.597835 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-client-ca\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.597857 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-config\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.613588 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"]
Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.618250 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm"] Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699020 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09c3c529-9d16-4205-bf3f-6d50c7216684-serving-cert\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699076 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296034a6-495a-44e7-bbec-41563a313938-serving-cert\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699118 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-proxy-ca-bundles\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699136 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlpws\" (UniqueName: \"kubernetes.io/projected/296034a6-495a-44e7-bbec-41563a313938-kube-api-access-jlpws\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/09c3c529-9d16-4205-bf3f-6d50c7216684-config\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699181 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-client-ca\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699203 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-config\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699224 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfmdr\" (UniqueName: \"kubernetes.io/projected/09c3c529-9d16-4205-bf3f-6d50c7216684-kube-api-access-tfmdr\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.699244 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09c3c529-9d16-4205-bf3f-6d50c7216684-client-ca\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " 
pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.700816 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-client-ca\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.700968 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-proxy-ca-bundles\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.701114 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296034a6-495a-44e7-bbec-41563a313938-config\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.703203 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296034a6-495a-44e7-bbec-41563a313938-serving-cert\") pod \"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.733226 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlpws\" (UniqueName: \"kubernetes.io/projected/296034a6-495a-44e7-bbec-41563a313938-kube-api-access-jlpws\") pod 
\"controller-manager-68d7f54cd4-h6jsj\" (UID: \"296034a6-495a-44e7-bbec-41563a313938\") " pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.799812 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c3c529-9d16-4205-bf3f-6d50c7216684-config\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.799873 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfmdr\" (UniqueName: \"kubernetes.io/projected/09c3c529-9d16-4205-bf3f-6d50c7216684-kube-api-access-tfmdr\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.799900 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09c3c529-9d16-4205-bf3f-6d50c7216684-client-ca\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.799933 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09c3c529-9d16-4205-bf3f-6d50c7216684-serving-cert\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.801480 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c3c529-9d16-4205-bf3f-6d50c7216684-config\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.802028 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09c3c529-9d16-4205-bf3f-6d50c7216684-client-ca\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.805219 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09c3c529-9d16-4205-bf3f-6d50c7216684-serving-cert\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.818203 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfmdr\" (UniqueName: \"kubernetes.io/projected/09c3c529-9d16-4205-bf3f-6d50c7216684-kube-api-access-tfmdr\") pod \"route-controller-manager-555c8b49bd-7swdm\" (UID: \"09c3c529-9d16-4205-bf3f-6d50c7216684\") " pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.887796 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:27 crc kubenswrapper[4867]: I0320 00:11:27.899458 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.269031 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"] Mar 20 00:11:28 crc kubenswrapper[4867]: W0320 00:11:28.283623 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod296034a6_495a_44e7_bbec_41563a313938.slice/crio-451245d20e0e6a4042159b7a988fa99029e4d8bcbce1c344b0d108e1d2af7575 WatchSource:0}: Error finding container 451245d20e0e6a4042159b7a988fa99029e4d8bcbce1c344b0d108e1d2af7575: Status 404 returned error can't find the container with id 451245d20e0e6a4042159b7a988fa99029e4d8bcbce1c344b0d108e1d2af7575 Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.333916 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm"] Mar 20 00:11:28 crc kubenswrapper[4867]: W0320 00:11:28.334423 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c3c529_9d16_4205_bf3f_6d50c7216684.slice/crio-8824cba284790dcde523cee1932f64f5f6252295a80595a56ec435a45dca5a5e WatchSource:0}: Error finding container 8824cba284790dcde523cee1932f64f5f6252295a80595a56ec435a45dca5a5e: Status 404 returned error can't find the container with id 8824cba284790dcde523cee1932f64f5f6252295a80595a56ec435a45dca5a5e Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.390568 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" event={"ID":"296034a6-495a-44e7-bbec-41563a313938","Type":"ContainerStarted","Data":"451245d20e0e6a4042159b7a988fa99029e4d8bcbce1c344b0d108e1d2af7575"} Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.391551 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" event={"ID":"09c3c529-9d16-4205-bf3f-6d50c7216684","Type":"ContainerStarted","Data":"8824cba284790dcde523cee1932f64f5f6252295a80595a56ec435a45dca5a5e"} Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.434688 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cd4758-795a-431c-ab92-b3156cfeb6e7" path="/var/lib/kubelet/pods/12cd4758-795a-431c-ab92-b3156cfeb6e7/volumes" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.436423 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b06883-a7a1-473b-bb3b-e8651ce58fae" path="/var/lib/kubelet/pods/21b06883-a7a1-473b-bb3b-e8651ce58fae/volumes" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.437867 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1e4d21-3692-4e9d-914c-12cd30f641fa" path="/var/lib/kubelet/pods/fb1e4d21-3692-4e9d-914c-12cd30f641fa/volumes" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.572898 4867 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.573707 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.574515 4867 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.574781 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3" gracePeriod=15 Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.574845 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28" gracePeriod=15 Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.574874 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb" gracePeriod=15 Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.574929 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606" gracePeriod=15 Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.574953 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e" gracePeriod=15 Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.579462 4867 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580243 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580287 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580330 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580343 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580358 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580371 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580394 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580407 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580425 
4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580441 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580471 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580485 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580568 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580581 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580603 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580616 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.580641 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.580658 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581052 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581088 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581116 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581132 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581147 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581170 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581184 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581201 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581241 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 00:11:28 crc 
kubenswrapper[4867]: E0320 00:11:28.581618 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.581641 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.612081 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.612469 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.612782 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.612826 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.613107 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.628215 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714079 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714186 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714226 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714248 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714272 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714280 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714307 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714330 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714336 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714374 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714394 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.714460 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.815059 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.815123 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.815135 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.815182 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.815187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.815217 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.823021 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podf8444d01_ed92_4d2f_9c64_d0b084e606e3.slice/crio-conmon-25968288f06038befe97e23d983146f23df87d419a6e684f476a5672c2274aa5.scope\": RecentStats: unable to find data in memory cache]" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.867032 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.867322 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 20 00:11:28 crc kubenswrapper[4867]: I0320 00:11:28.929232 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:11:28 crc kubenswrapper[4867]: W0320 00:11:28.946909 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ca7eeb85d29185d4e308c37154f960fef1a9baad4ea25914cd5c5c2b47bf54df WatchSource:0}: Error finding container ca7eeb85d29185d4e308c37154f960fef1a9baad4ea25914cd5c5c2b47bf54df: Status 404 returned error can't find the container with id ca7eeb85d29185d4e308c37154f960fef1a9baad4ea25914cd5c5c2b47bf54df Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.950260 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e64369c9e0761 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:11:28.949167969 +0000 UTC m=+303.175705526,LastTimestamp:2026-03-20 00:11:28.949167969 +0000 UTC m=+303.175705526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:11:28 crc kubenswrapper[4867]: E0320 00:11:28.998316 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e64369c9e0761 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:11:28.949167969 +0000 UTC m=+303.175705526,LastTimestamp:2026-03-20 00:11:28.949167969 +0000 UTC m=+303.175705526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.403335 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" event={"ID":"296034a6-495a-44e7-bbec-41563a313938","Type":"ContainerStarted","Data":"25c4de1500833e024477c9ccbf5e475e68b6844149bb008abdc9f7b52e3c8524"} Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.403662 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.404167 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 
38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.404915 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.405403 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.406598 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c"} Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.406638 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ca7eeb85d29185d4e308c37154f960fef1a9baad4ea25914cd5c5c2b47bf54df"} Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.408034 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 
00:11:29.408226 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" event={"ID":"09c3c529-9d16-4205-bf3f-6d50c7216684","Type":"ContainerStarted","Data":"4a45586c25b652d045749a828103146b25b09a4cd52d0019d7df8478adc77820"} Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.408468 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.408538 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.408867 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.408932 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.409123 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 
00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.409430 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.410038 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.410558 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.411164 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.411236 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.411644 4867 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.412086 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.412370 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.412591 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.413093 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28" exitCode=0 Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.413117 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e" exitCode=0 Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.413126 4867 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb" exitCode=0 Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.413139 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606" exitCode=2 Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.413248 4867 scope.go:117] "RemoveContainer" containerID="ec73185eccf7f6bcc6e3bbd3217918d6031cdc5e3c72ac43fa206ea2255ad894" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.414993 4867 generic.go:334] "Generic (PLEG): container finished" podID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" containerID="25968288f06038befe97e23d983146f23df87d419a6e684f476a5672c2274aa5" exitCode=0 Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.415120 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8444d01-ed92-4d2f-9c64-d0b084e606e3","Type":"ContainerDied","Data":"25968288f06038befe97e23d983146f23df87d419a6e684f476a5672c2274aa5"} Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.415749 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.416191 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 
38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.416454 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.416800 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.417060 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.452361 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 00:11:29 crc kubenswrapper[4867]: I0320 00:11:29.452416 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.089698 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.090300 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.091010 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.091570 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.092061 4867 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.092101 4867 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.092686 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.230:6443: connect: connection refused" interval="200ms" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.293408 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="400ms" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.409109 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.409193 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.426478 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 00:11:30 crc kubenswrapper[4867]: E0320 00:11:30.694938 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="800ms" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.717888 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.718389 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.718703 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.718932 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.719131 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.838937 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-var-lock\") pod \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.839382 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kube-api-access\") pod \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.839071 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-var-lock" (OuterVolumeSpecName: "var-lock") pod "f8444d01-ed92-4d2f-9c64-d0b084e606e3" (UID: "f8444d01-ed92-4d2f-9c64-d0b084e606e3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.839403 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kubelet-dir\") pod \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\" (UID: \"f8444d01-ed92-4d2f-9c64-d0b084e606e3\") " Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.839443 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f8444d01-ed92-4d2f-9c64-d0b084e606e3" (UID: "f8444d01-ed92-4d2f-9c64-d0b084e606e3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.839841 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.839865 4867 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8444d01-ed92-4d2f-9c64-d0b084e606e3-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.844240 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f8444d01-ed92-4d2f-9c64-d0b084e606e3" (UID: "f8444d01-ed92-4d2f-9c64-d0b084e606e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:11:30 crc kubenswrapper[4867]: I0320 00:11:30.941165 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8444d01-ed92-4d2f-9c64-d0b084e606e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.428893 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.428980 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.438853 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.440391 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3" exitCode=0 Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.443531 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8444d01-ed92-4d2f-9c64-d0b084e606e3","Type":"ContainerDied","Data":"3c68b36d95239e1a6814ac6ab2ed26283627e7aed8e4b7ff8c566905a9172a89"} Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.443566 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c68b36d95239e1a6814ac6ab2ed26283627e7aed8e4b7ff8c566905a9172a89" Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.443610 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.479063 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.479445 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.479934 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.480405 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.480939 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.482084 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.482741 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.483285 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.483756 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.484118 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.484445 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:31 crc kubenswrapper[4867]: E0320 00:11:31.507651 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="1.6s"
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651221 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651536 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651693 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651319 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651806 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651834 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651914 4867 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651926 4867 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:31 crc kubenswrapper[4867]: I0320 00:11:31.651934 4867 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.428785 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.453401 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.454747 4867 scope.go:117] "RemoveContainer" containerID="c47f89555a96c3b3874bf5b26a38524a2a32810c250dbb1881104a277dd46c28"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.454867 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.456132 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.456591 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.456964 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.457272 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.457523 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.463291 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.463832 4867 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.465912 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.466484 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.467270 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.480915 4867 scope.go:117] "RemoveContainer" containerID="d7182297b3383ecd65598575f8541b023f48b4897ddada6996e614ea12a8327e"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.498609 4867 scope.go:117] "RemoveContainer" containerID="13bd7c5e1c8750f967553b14d982da45ab8f1e25d19701b7b724a06463e657fb"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.514155 4867 scope.go:117] "RemoveContainer" containerID="600ff9a9056303933733dcd3543ac6b9a9820390faedb6c6d8b1398bc6a85606"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.533730 4867 scope.go:117] "RemoveContainer" containerID="6872db1a0240c76f9562a4759a9558808ce7e80d7cc40b8efa7333b44961a7d3"
Mar 20 00:11:32 crc kubenswrapper[4867]: I0320 00:11:32.563338 4867 scope.go:117] "RemoveContainer" containerID="0faeb38152f30ade43d98ca6bfa2b4dafa1ae1e7348880fe1950eabf55fbcd59"
Mar 20 00:11:33 crc kubenswrapper[4867]: E0320 00:11:33.109586 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="3.2s"
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.472844 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-68d7f54cd4-h6jsj_296034a6-495a-44e7-bbec-41563a313938/controller-manager/0.log"
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.472890 4867 generic.go:334] "Generic (PLEG): container finished" podID="296034a6-495a-44e7-bbec-41563a313938" containerID="25c4de1500833e024477c9ccbf5e475e68b6844149bb008abdc9f7b52e3c8524" exitCode=255
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.472917 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" event={"ID":"296034a6-495a-44e7-bbec-41563a313938","Type":"ContainerDied","Data":"25c4de1500833e024477c9ccbf5e475e68b6844149bb008abdc9f7b52e3c8524"}
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.473308 4867 scope.go:117] "RemoveContainer" containerID="25c4de1500833e024477c9ccbf5e475e68b6844149bb008abdc9f7b52e3c8524"
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.473591 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.474007 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.474330 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:34 crc kubenswrapper[4867]: I0320 00:11:34.474677 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.479814 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-68d7f54cd4-h6jsj_296034a6-495a-44e7-bbec-41563a313938/controller-manager/0.log"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.480198 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" event={"ID":"296034a6-495a-44e7-bbec-41563a313938","Type":"ContainerStarted","Data":"305d7c9e3f772786d32bcc0811d4030e70908311fd3ddbdf529717c2030d0b78"}
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.481443 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.481448 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.481977 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.482576 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.482979 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.488527 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.489025 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.489452 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.489954 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:35 crc kubenswrapper[4867]: I0320 00:11:35.490396 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:36 crc kubenswrapper[4867]: E0320 00:11:36.310691 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="6.4s"
Mar 20 00:11:36 crc kubenswrapper[4867]: I0320 00:11:36.423958 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:36 crc kubenswrapper[4867]: I0320 00:11:36.424570 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:36 crc kubenswrapper[4867]: I0320 00:11:36.424932 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:36 crc kubenswrapper[4867]: I0320 00:11:36.425293 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:38 crc kubenswrapper[4867]: I0320 00:11:38.900381 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 00:11:38 crc kubenswrapper[4867]: I0320 00:11:38.901012 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 00:11:39 crc kubenswrapper[4867]: E0320 00:11:39.000198 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e64369c9e0761 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 00:11:28.949167969 +0000 UTC m=+303.175705526,LastTimestamp:2026-03-20 00:11:28.949167969 +0000 UTC m=+303.175705526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.421300 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.422184 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.422508 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.422795 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.423087 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.694567 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.694889 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:40 crc kubenswrapper[4867]: E0320 00:11:40.695696 4867 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:40 crc kubenswrapper[4867]: I0320 00:11:40.696405 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.518015 4867 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4fde17b82172c15bee5b47fbc54f43622eb17519ce9ac4beabde8a65bc8a83e6" exitCode=0
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.518153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4fde17b82172c15bee5b47fbc54f43622eb17519ce9ac4beabde8a65bc8a83e6"}
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.518224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aa7ac376242f0b81b01d30682e4cec464d5d6298d4acb9a9441d8ad2905f65b8"}
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.518749 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.518780 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:41 crc kubenswrapper[4867]: E0320 00:11:41.519368 4867 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.520160 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.520694 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.521401 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.521910 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.523844 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.524668 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.524741 4867 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f" exitCode=1
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.524781 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f"}
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.525444 4867 scope.go:117] "RemoveContainer" containerID="c67b235ca7dadeca35706eb2eb341dc56002a0b2bd99586116914ef6d6e3ee7f"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.525777 4867 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.526214 4867 status_manager.go:851] "Failed to get status for pod" podUID="296034a6-495a-44e7-bbec-41563a313938" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-68d7f54cd4-h6jsj\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.526728 4867 status_manager.go:851] "Failed to get status for pod" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.527354 4867 status_manager.go:851] "Failed to get status for pod" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-555c8b49bd-7swdm\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:41 crc kubenswrapper[4867]: I0320 00:11:41.528385 4867 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 20 00:11:42 crc kubenswrapper[4867]: I0320 00:11:42.557902 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8fe071653a665dc5fa0be8e3f6fbab597b3aaa37fda8a22551c64bab593df3d8"}
Mar 20 00:11:42 crc kubenswrapper[4867]: I0320 00:11:42.558127 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"368dea0e1be26a56795e8f015909c9e35e1cd673580ee6a71499df30eaf38019"}
Mar 20 00:11:42 crc kubenswrapper[4867]: I0320 00:11:42.558137 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a537971dcb978143d18c5835e409ce2c16f692ea5c73f17ad4218c4aa6500fe1"}
Mar 20 00:11:42 crc kubenswrapper[4867]: I0320 00:11:42.573653 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 00:11:42 crc kubenswrapper[4867]: I0320 00:11:42.574523 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 00:11:42 crc kubenswrapper[4867]: I0320 00:11:42.574588 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c98325b5c515f2c825e168efa734d89b022ebd8c4fe675d3f0fa67e19305af3f"}
Mar 20 00:11:43 crc kubenswrapper[4867]: I0320 00:11:43.581424 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f540df86739bcb0cb1749dbbf77ac6d56cc127ccbfde3734f9f73243272101e2"}
Mar 20 00:11:43 crc kubenswrapper[4867]: I0320 00:11:43.581466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a789e1b3bb7a641fadf995ff9755677f6cccee3b131185e89751b7c4022ad2f7"}
Mar 20 00:11:43 crc kubenswrapper[4867]: I0320 00:11:43.581614 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:43 crc kubenswrapper[4867]: I0320 00:11:43.581739 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:43 crc kubenswrapper[4867]: I0320 00:11:43.581757 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:45 crc kubenswrapper[4867]: I0320 00:11:45.696916 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:45 crc kubenswrapper[4867]: I0320 00:11:45.697332 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:45 crc kubenswrapper[4867]: I0320 00:11:45.703820 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.350436 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.590033 4867 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.621584 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.621625 4867 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f" Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.625536 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.628302 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="88657110-8e88-4b68-b2f1-19a3489b7b01" Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.900733 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:11:48 crc kubenswrapper[4867]: I0320 00:11:48.900814 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:11:49 crc kubenswrapper[4867]: I0320 00:11:49.199647 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:11:49 crc kubenswrapper[4867]: I0320 00:11:49.205063 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:11:49 crc kubenswrapper[4867]: I0320 00:11:49.627760 4867 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f" Mar 20 00:11:49 crc kubenswrapper[4867]: I0320 00:11:49.627812 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f" Mar 20 00:11:56 crc kubenswrapper[4867]: I0320 00:11:56.443937 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="88657110-8e88-4b68-b2f1-19a3489b7b01" Mar 20 00:11:58 crc kubenswrapper[4867]: I0320 00:11:58.117823 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 00:11:58 crc kubenswrapper[4867]: I0320 00:11:58.358630 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 00:11:58 crc kubenswrapper[4867]: I0320 00:11:58.488087 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 00:11:58 crc kubenswrapper[4867]: I0320 00:11:58.880684 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": read tcp 10.217.0.2:34132->10.217.0.69:8443: read: connection reset by peer" start-of-body= Mar 20 00:11:58 crc kubenswrapper[4867]: I0320 00:11:58.880734 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": read tcp 10.217.0.2:34132->10.217.0.69:8443: 
read: connection reset by peer" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.354735 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.461563 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.594800 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.700324 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-555c8b49bd-7swdm_09c3c529-9d16-4205-bf3f-6d50c7216684/route-controller-manager/0.log" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.700378 4867 generic.go:334] "Generic (PLEG): container finished" podID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerID="4a45586c25b652d045749a828103146b25b09a4cd52d0019d7df8478adc77820" exitCode=255 Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.700409 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" event={"ID":"09c3c529-9d16-4205-bf3f-6d50c7216684","Type":"ContainerDied","Data":"4a45586c25b652d045749a828103146b25b09a4cd52d0019d7df8478adc77820"} Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.700880 4867 scope.go:117] "RemoveContainer" containerID="4a45586c25b652d045749a828103146b25b09a4cd52d0019d7df8478adc77820" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.901880 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 00:11:59 crc kubenswrapper[4867]: I0320 00:11:59.950623 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.337689 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.476678 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.707706 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-555c8b49bd-7swdm_09c3c529-9d16-4205-bf3f-6d50c7216684/route-controller-manager/0.log" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.707766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" event={"ID":"09c3c529-9d16-4205-bf3f-6d50c7216684","Type":"ContainerStarted","Data":"1f23821923c3963e774e899ffb8f6a8bcf43abb8673bceb30edff1b3ed124bc8"} Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.708087 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.733568 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.779565 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 00:12:00 crc kubenswrapper[4867]: I0320 00:12:00.809668 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.043505 4867 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.434584 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.708088 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.708197 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.708225 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.726465 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.766071 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.842635 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 00:12:01 crc kubenswrapper[4867]: I0320 00:12:01.995120 4867 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.015632 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.062206 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.067525 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.489854 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.528284 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.576432 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.632276 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.662433 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.702961 4867 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.713949 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.714045 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.751184 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.793783 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.795995 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.820225 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.832377 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.897791 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 00:12:02 crc kubenswrapper[4867]: I0320 00:12:02.946544 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 00:12:02 
crc kubenswrapper[4867]: I0320 00:12:02.991907 4867 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.064101 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.069836 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.140578 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.444749 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.461598 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.471757 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.482322 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.598108 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.612727 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.646969 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.828734 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 00:12:03 crc kubenswrapper[4867]: I0320 00:12:03.952297 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.025140 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.025637 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.033338 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.058909 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.069235 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.103029 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.400579 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.475327 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.555448 
4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.596729 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.629414 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.642950 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.738987 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.794877 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.877091 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.878310 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.942250 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 00:12:04 crc kubenswrapper[4867]: I0320 00:12:04.976220 4867 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.045947 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.210814 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.293194 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.297915 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.299949 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.407606 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.449235 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.479322 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.591704 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.627794 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.657385 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.772635 4867 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.903739 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 00:12:05 crc kubenswrapper[4867]: I0320 00:12:05.946205 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.017848 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.333254 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.362306 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.394688 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.415893 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.478069 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.506246 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.540097 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.576183 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.678385 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.799828 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.879010 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.907770 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 00:12:06 crc kubenswrapper[4867]: I0320 00:12:06.950751 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.006557 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.024597 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.044073 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.155243 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 
00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.165437 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.251644 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.267051 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.310882 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.371254 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.398450 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.402662 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.428067 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.533408 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.572741 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.663642 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.728122 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.814330 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.878947 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 00:12:07 crc kubenswrapper[4867]: I0320 00:12:07.947671 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.101673 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.258844 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.272825 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.436712 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.521601 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.655103 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.684007 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.705248 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.715752 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.729356 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.842340 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.886365 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.900376 4867 patch_prober.go:28] interesting pod/route-controller-manager-555c8b49bd-7swdm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.900440 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podUID="09c3c529-9d16-4205-bf3f-6d50c7216684" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 00:12:08 crc kubenswrapper[4867]: I0320 00:12:08.903494 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.018235 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.069312 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.110093 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.112210 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.137417 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.208161 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.233623 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.279102 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.323299 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.323897 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.473407 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.592234 4867 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.594618 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.595575 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.595534309 podStartE2EDuration="41.595534309s" podCreationTimestamp="2026-03-20 00:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:11:48.430565083 +0000 UTC m=+322.657102600" watchObservedRunningTime="2026-03-20 00:12:09.595534309 +0000 UTC m=+343.822071826"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.595818 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" podStartSLOduration=43.595813526 podStartE2EDuration="43.595813526s" podCreationTimestamp="2026-03-20 00:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:11:48.394825907 +0000 UTC m=+322.621363424" watchObservedRunningTime="2026-03-20 00:12:09.595813526 +0000 UTC m=+343.822351033"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.596536 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68d7f54cd4-h6jsj" podStartSLOduration=44.596485013 podStartE2EDuration="44.596485013s" podCreationTimestamp="2026-03-20 00:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:11:48.450112399 +0000 UTC m=+322.676649916" watchObservedRunningTime="2026-03-20 00:12:09.596485013 +0000 UTC m=+343.823022530"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.598781 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.598821 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566092-d4zmt","openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"]
Mar 20 00:12:09 crc kubenswrapper[4867]: E0320 00:12:09.599068 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" containerName="installer"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.599094 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" containerName="installer"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.599195 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8444d01-ed92-4d2f-9c64-d0b084e606e3" containerName="installer"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.599211 4867 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.599228 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89e9fa2f-fe9b-4511-92a6-d015d83f656f"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.599920 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.600350 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566092-d4zmt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.602068 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.602776 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.603400 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.603750 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.604891 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.605021 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.605095 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.605297 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.605558 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.606078 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.606183 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.606206 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.606269 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.606354 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.606376 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.608904 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.622351 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.625576 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.630520 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.630477041 podStartE2EDuration="21.630477041s" podCreationTimestamp="2026-03-20 00:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:12:09.628938401 +0000 UTC m=+343.855475938" watchObservedRunningTime="2026-03-20 00:12:09.630477041 +0000 UTC m=+343.857014558"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.641476 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663607 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-login\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663662 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-router-certs\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663691 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-service-ca\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663743 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663763 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663794 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663819 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663852 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-session\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663877 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcdp\" (UniqueName: \"kubernetes.io/projected/42de52eb-136e-4ef6-8ccb-f00f63b8c594-kube-api-access-jpcdp\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663923 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-audit-policies\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42de52eb-136e-4ef6-8ccb-f00f63b8c594-audit-dir\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.663982 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hnm\" (UniqueName: \"kubernetes.io/projected/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2-kube-api-access-42hnm\") pod \"auto-csr-approver-29566092-d4zmt\" (UID: \"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2\") " pod="openshift-infra/auto-csr-approver-29566092-d4zmt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.664005 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-error\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.664042 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.744858 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.765784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-login\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766091 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-router-certs\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766112 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-service-ca\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766131 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766152 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766168 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766190 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766818 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766865 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-session\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766885 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpcdp\" (UniqueName: \"kubernetes.io/projected/42de52eb-136e-4ef6-8ccb-f00f63b8c594-kube-api-access-jpcdp\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766914 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-audit-policies\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766936 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42de52eb-136e-4ef6-8ccb-f00f63b8c594-audit-dir\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766961 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hnm\" (UniqueName: \"kubernetes.io/projected/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2-kube-api-access-42hnm\") pod \"auto-csr-approver-29566092-d4zmt\" (UID: \"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2\") " pod="openshift-infra/auto-csr-approver-29566092-d4zmt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766975 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-error\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.766991 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.767112 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-service-ca\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.767419 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.767514 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42de52eb-136e-4ef6-8ccb-f00f63b8c594-audit-dir\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.768143 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-audit-policies\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.768956 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.771515 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-login\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.771515 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-router-certs\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.771521 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.771691 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-session\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.771834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.773559 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.775181 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-user-template-error\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.781013 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42de52eb-136e-4ef6-8ccb-f00f63b8c594-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.783881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpcdp\" (UniqueName: \"kubernetes.io/projected/42de52eb-136e-4ef6-8ccb-f00f63b8c594-kube-api-access-jpcdp\") pod \"oauth-openshift-54f6d9f944-qgfvv\" (UID: \"42de52eb-136e-4ef6-8ccb-f00f63b8c594\") " pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.785480 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hnm\" (UniqueName: \"kubernetes.io/projected/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2-kube-api-access-42hnm\") pod \"auto-csr-approver-29566092-d4zmt\" (UID: \"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2\") " pod="openshift-infra/auto-csr-approver-29566092-d4zmt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.824534 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.836280 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.928026 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.935904 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.945060 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566092-d4zmt"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.955626 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 00:12:09 crc kubenswrapper[4867]: I0320 00:12:09.997852 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.041814 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.057882 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.112457 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.141189 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.155215 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566092-d4zmt"]
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.318392 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.329833 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54f6d9f944-qgfvv"]
Mar 20 00:12:10 crc kubenswrapper[4867]: W0320 00:12:10.334722 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42de52eb_136e_4ef6_8ccb_f00f63b8c594.slice/crio-073e686c51a6d1f1bf0adee049ae21b419dd5a0d0501e0b68476b3e74f120a3a WatchSource:0}: Error finding container 073e686c51a6d1f1bf0adee049ae21b419dd5a0d0501e0b68476b3e74f120a3a: Status 404 returned error can't find the container with id 073e686c51a6d1f1bf0adee049ae21b419dd5a0d0501e0b68476b3e74f120a3a
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.335748 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.425893 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.440410 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.530751 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.559782 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.611589 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.661443 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.662610 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 00:12:10 crc kubenswrapper[4867]: I0320
00:12:10.668222 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.762305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566092-d4zmt" event={"ID":"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2","Type":"ContainerStarted","Data":"1e1c395c8370bd6485468aee6ba790a14040a57df59e5f4ebe3c861852fb83db"} Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.764179 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" event={"ID":"42de52eb-136e-4ef6-8ccb-f00f63b8c594","Type":"ContainerStarted","Data":"d71341344e18328744dc5cd0609e7da767eeed7dd58e19673e4c860fec46aebf"} Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.764208 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" event={"ID":"42de52eb-136e-4ef6-8ccb-f00f63b8c594","Type":"ContainerStarted","Data":"073e686c51a6d1f1bf0adee049ae21b419dd5a0d0501e0b68476b3e74f120a3a"} Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.764462 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.785730 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.790426 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" podStartSLOduration=69.790407903 podStartE2EDuration="1m9.790407903s" podCreationTimestamp="2026-03-20 00:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:12:10.789613042 +0000 
UTC m=+345.016150569" watchObservedRunningTime="2026-03-20 00:12:10.790407903 +0000 UTC m=+345.016945420" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.958935 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.959395 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.965223 4867 patch_prober.go:28] interesting pod/oauth-openshift-54f6d9f944-qgfvv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.70:6443/healthz\": read tcp 10.217.0.2:51866->10.217.0.70:6443: read: connection reset by peer" start-of-body= Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.965279 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" podUID="42de52eb-136e-4ef6-8ccb-f00f63b8c594" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.70:6443/healthz\": read tcp 10.217.0.2:51866->10.217.0.70:6443: read: connection reset by peer" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.968447 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 00:12:10 crc kubenswrapper[4867]: I0320 00:12:10.978859 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.004133 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.016678 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.042742 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.070227 4867 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.070545 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c" gracePeriod=5 Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.072702 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.078551 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.122258 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.162123 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.185504 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.196647 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 
00:12:11.275319 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.441532 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.455888 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.548937 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.556617 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.629670 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.636704 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.667749 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.761027 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.770300 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54f6d9f944-qgfvv_42de52eb-136e-4ef6-8ccb-f00f63b8c594/oauth-openshift/0.log" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.770458 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="42de52eb-136e-4ef6-8ccb-f00f63b8c594" containerID="d71341344e18328744dc5cd0609e7da767eeed7dd58e19673e4c860fec46aebf" exitCode=255 Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.770510 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" event={"ID":"42de52eb-136e-4ef6-8ccb-f00f63b8c594","Type":"ContainerDied","Data":"d71341344e18328744dc5cd0609e7da767eeed7dd58e19673e4c860fec46aebf"} Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.771568 4867 scope.go:117] "RemoveContainer" containerID="d71341344e18328744dc5cd0609e7da767eeed7dd58e19673e4c860fec46aebf" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.796777 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.883935 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.920409 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 00:12:11 crc kubenswrapper[4867]: I0320 00:12:11.936884 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.053183 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.055576 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.115025 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.115878 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.212543 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.242052 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.347008 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.406344 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.412631 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.679810 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.685990 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.753551 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.787510 4867 generic.go:334] "Generic (PLEG): container finished" podID="442e78ca-a2a5-4281-9fe0-e42e0d4e1df2" containerID="d6ec8864937d04a1757bdacd9ca350fdde70b7431168b255ea3664369b55f6f1" exitCode=0 Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.787703 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566092-d4zmt" 
event={"ID":"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2","Type":"ContainerDied","Data":"d6ec8864937d04a1757bdacd9ca350fdde70b7431168b255ea3664369b55f6f1"} Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.790091 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54f6d9f944-qgfvv_42de52eb-136e-4ef6-8ccb-f00f63b8c594/oauth-openshift/1.log" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.792240 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54f6d9f944-qgfvv_42de52eb-136e-4ef6-8ccb-f00f63b8c594/oauth-openshift/0.log" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.792288 4867 generic.go:334] "Generic (PLEG): container finished" podID="42de52eb-136e-4ef6-8ccb-f00f63b8c594" containerID="c39925a15506501901ae27755afbb96d18cdedad5dd511966ab6efb86acf0cad" exitCode=255 Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.792315 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" event={"ID":"42de52eb-136e-4ef6-8ccb-f00f63b8c594","Type":"ContainerDied","Data":"c39925a15506501901ae27755afbb96d18cdedad5dd511966ab6efb86acf0cad"} Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.792336 4867 scope.go:117] "RemoveContainer" containerID="d71341344e18328744dc5cd0609e7da767eeed7dd58e19673e4c860fec46aebf" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.793201 4867 scope.go:117] "RemoveContainer" containerID="c39925a15506501901ae27755afbb96d18cdedad5dd511966ab6efb86acf0cad" Mar 20 00:12:12 crc kubenswrapper[4867]: E0320 00:12:12.793724 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-54f6d9f944-qgfvv_openshift-authentication(42de52eb-136e-4ef6-8ccb-f00f63b8c594)\"" 
pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" podUID="42de52eb-136e-4ef6-8ccb-f00f63b8c594" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.941673 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 00:12:12 crc kubenswrapper[4867]: I0320 00:12:12.996238 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.058278 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.161592 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.213740 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.216309 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.354141 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.500811 4867 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.573289 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.640568 4867 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.718848 4867 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.751521 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.802316 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54f6d9f944-qgfvv_42de52eb-136e-4ef6-8ccb-f00f63b8c594/oauth-openshift/1.log" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.803197 4867 scope.go:117] "RemoveContainer" containerID="c39925a15506501901ae27755afbb96d18cdedad5dd511966ab6efb86acf0cad" Mar 20 00:12:13 crc kubenswrapper[4867]: E0320 00:12:13.803382 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-54f6d9f944-qgfvv_openshift-authentication(42de52eb-136e-4ef6-8ccb-f00f63b8c594)\"" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" podUID="42de52eb-136e-4ef6-8ccb-f00f63b8c594" Mar 20 00:12:13 crc kubenswrapper[4867]: I0320 00:12:13.854602 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.012337 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.054216 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566092-d4zmt" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.131739 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42hnm\" (UniqueName: \"kubernetes.io/projected/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2-kube-api-access-42hnm\") pod \"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2\" (UID: \"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2\") " Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.154365 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2-kube-api-access-42hnm" (OuterVolumeSpecName: "kube-api-access-42hnm") pod "442e78ca-a2a5-4281-9fe0-e42e0d4e1df2" (UID: "442e78ca-a2a5-4281-9fe0-e42e0d4e1df2"). InnerVolumeSpecName "kube-api-access-42hnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.232557 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42hnm\" (UniqueName: \"kubernetes.io/projected/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2-kube-api-access-42hnm\") on node \"crc\" DevicePath \"\"" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.330069 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.534649 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.599066 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.664674 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.683322 4867 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.686827 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.810022 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566092-d4zmt" event={"ID":"442e78ca-a2a5-4281-9fe0-e42e0d4e1df2","Type":"ContainerDied","Data":"1e1c395c8370bd6485468aee6ba790a14040a57df59e5f4ebe3c861852fb83db"} Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.810061 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e1c395c8370bd6485468aee6ba790a14040a57df59e5f4ebe3c861852fb83db" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.810116 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566092-d4zmt" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.869052 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.936370 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.937030 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 00:12:14 crc kubenswrapper[4867]: I0320 00:12:14.982543 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.009877 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.122054 
4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.184640 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.350128 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.604263 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.632818 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.642237 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 00:12:15 crc kubenswrapper[4867]: I0320 00:12:15.934938 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.114176 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.170074 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.193544 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.193636 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.204922 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.288636 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359116 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359223 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359236 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359330 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359363 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359335 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359422 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359537 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359688 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359887 4867 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359918 4867 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359937 4867 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.359957 4867 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.370667 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.372010 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.433044 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.433570 4867 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.445834 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.445875 4867 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="72cdefd8-ac0f-4bab-a339-37436c2508a1" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.452598 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.452648 4867 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="72cdefd8-ac0f-4bab-a339-37436c2508a1" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.461385 4867 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.711162 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.821937 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.821991 4867 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c" exitCode=137 Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.822042 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.822073 4867 scope.go:117] "RemoveContainer" containerID="40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.839616 4867 scope.go:117] "RemoveContainer" containerID="40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c" Mar 20 00:12:16 crc kubenswrapper[4867]: E0320 00:12:16.840008 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c\": container with ID starting with 40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c not found: ID does not exist" containerID="40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c" Mar 20 00:12:16 crc kubenswrapper[4867]: I0320 00:12:16.840097 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c"} err="failed to get container status \"40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c\": rpc error: code = NotFound desc = could not find container 
\"40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c\": container with ID starting with 40acdc06863c6228a25cb62574c3edd8e920ff2c9e042d032168039fcec0747c not found: ID does not exist" Mar 20 00:12:17 crc kubenswrapper[4867]: I0320 00:12:17.903528 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-555c8b49bd-7swdm" Mar 20 00:12:18 crc kubenswrapper[4867]: I0320 00:12:18.223390 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 00:12:19 crc kubenswrapper[4867]: I0320 00:12:19.936826 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" Mar 20 00:12:19 crc kubenswrapper[4867]: I0320 00:12:19.937277 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" Mar 20 00:12:19 crc kubenswrapper[4867]: I0320 00:12:19.938215 4867 scope.go:117] "RemoveContainer" containerID="c39925a15506501901ae27755afbb96d18cdedad5dd511966ab6efb86acf0cad" Mar 20 00:12:19 crc kubenswrapper[4867]: E0320 00:12:19.938699 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-54f6d9f944-qgfvv_openshift-authentication(42de52eb-136e-4ef6-8ccb-f00f63b8c594)\"" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" podUID="42de52eb-136e-4ef6-8ccb-f00f63b8c594" Mar 20 00:12:34 crc kubenswrapper[4867]: I0320 00:12:34.422112 4867 scope.go:117] "RemoveContainer" containerID="c39925a15506501901ae27755afbb96d18cdedad5dd511966ab6efb86acf0cad" Mar 20 00:12:34 crc kubenswrapper[4867]: I0320 00:12:34.951790 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-54f6d9f944-qgfvv_42de52eb-136e-4ef6-8ccb-f00f63b8c594/oauth-openshift/1.log" Mar 20 00:12:34 crc kubenswrapper[4867]: I0320 00:12:34.952174 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" event={"ID":"42de52eb-136e-4ef6-8ccb-f00f63b8c594","Type":"ContainerStarted","Data":"387e9c293a53b035e6d42b07682bf2c0d576526826532ed24061c9131cff500e"} Mar 20 00:12:34 crc kubenswrapper[4867]: I0320 00:12:34.952605 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" Mar 20 00:12:35 crc kubenswrapper[4867]: I0320 00:12:35.168817 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54f6d9f944-qgfvv" Mar 20 00:12:40 crc kubenswrapper[4867]: I0320 00:12:40.992810 4867 generic.go:334] "Generic (PLEG): container finished" podID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerID="a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab" exitCode=0 Mar 20 00:12:40 crc kubenswrapper[4867]: I0320 00:12:40.992946 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" event={"ID":"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce","Type":"ContainerDied","Data":"a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab"} Mar 20 00:12:40 crc kubenswrapper[4867]: I0320 00:12:40.993554 4867 scope.go:117] "RemoveContainer" containerID="a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab" Mar 20 00:12:42 crc kubenswrapper[4867]: I0320 00:12:42.002130 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" event={"ID":"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce","Type":"ContainerStarted","Data":"875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04"} Mar 20 00:12:42 crc 
kubenswrapper[4867]: I0320 00:12:42.002703 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:12:42 crc kubenswrapper[4867]: I0320 00:12:42.004554 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.059614 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c8vpm"] Mar 20 00:13:37 crc kubenswrapper[4867]: E0320 00:13:37.061882 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.061983 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 00:13:37 crc kubenswrapper[4867]: E0320 00:13:37.062062 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442e78ca-a2a5-4281-9fe0-e42e0d4e1df2" containerName="oc" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.062133 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="442e78ca-a2a5-4281-9fe0-e42e0d4e1df2" containerName="oc" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.062318 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.062433 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="442e78ca-a2a5-4281-9fe0-e42e0d4e1df2" containerName="oc" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.062895 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.076680 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c8vpm"] Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253574 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d159680-38f1-4749-87c4-69b28c9a8241-registry-certificates\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-registry-tls\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253726 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r9m\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-kube-api-access-m9r9m\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253765 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d159680-38f1-4749-87c4-69b28c9a8241-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253816 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253865 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d159680-38f1-4749-87c4-69b28c9a8241-trusted-ca\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253940 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d159680-38f1-4749-87c4-69b28c9a8241-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.253983 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-bound-sa-token\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.285954 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.354815 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d159680-38f1-4749-87c4-69b28c9a8241-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.355196 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-bound-sa-token\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.355393 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d159680-38f1-4749-87c4-69b28c9a8241-registry-certificates\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.355519 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-registry-tls\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.355606 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r9m\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-kube-api-access-m9r9m\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.355691 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d159680-38f1-4749-87c4-69b28c9a8241-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.355807 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d159680-38f1-4749-87c4-69b28c9a8241-trusted-ca\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.356272 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d159680-38f1-4749-87c4-69b28c9a8241-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.356994 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d159680-38f1-4749-87c4-69b28c9a8241-trusted-ca\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc 
kubenswrapper[4867]: I0320 00:13:37.357425 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d159680-38f1-4749-87c4-69b28c9a8241-registry-certificates\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.363472 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-registry-tls\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.363549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d159680-38f1-4749-87c4-69b28c9a8241-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.376540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-bound-sa-token\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.380080 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r9m\" (UniqueName: \"kubernetes.io/projected/7d159680-38f1-4749-87c4-69b28c9a8241-kube-api-access-m9r9m\") pod \"image-registry-66df7c8f76-c8vpm\" (UID: \"7d159680-38f1-4749-87c4-69b28c9a8241\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.382547 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:37 crc kubenswrapper[4867]: I0320 00:13:37.830695 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c8vpm"] Mar 20 00:13:38 crc kubenswrapper[4867]: I0320 00:13:38.406904 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" event={"ID":"7d159680-38f1-4749-87c4-69b28c9a8241","Type":"ContainerStarted","Data":"663838d54299b6f27400f3d5b873c71e641f6d748360d98ba1613105973ce532"} Mar 20 00:13:38 crc kubenswrapper[4867]: I0320 00:13:38.407486 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" event={"ID":"7d159680-38f1-4749-87c4-69b28c9a8241","Type":"ContainerStarted","Data":"4763795aa2c473c3ce3fbef87c88d3662ccbda2120d7afdd346718d48d51c12f"} Mar 20 00:13:38 crc kubenswrapper[4867]: I0320 00:13:38.407801 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" Mar 20 00:13:38 crc kubenswrapper[4867]: I0320 00:13:38.429113 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm" podStartSLOduration=1.4290899989999999 podStartE2EDuration="1.429089999s" podCreationTimestamp="2026-03-20 00:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:13:38.42417857 +0000 UTC m=+432.650716087" watchObservedRunningTime="2026-03-20 00:13:38.429089999 +0000 UTC m=+432.655627516" Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.896790 4867 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9sbs"] Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.897591 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w9sbs" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="registry-server" containerID="cri-o://cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd" gracePeriod=30 Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.900479 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fsj5n"] Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.900660 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fsj5n" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="registry-server" containerID="cri-o://8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1" gracePeriod=30 Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.913900 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fhrs"] Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.914209 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" containerID="cri-o://875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04" gracePeriod=30 Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.935077 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w9fsp"] Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.936192 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.949148 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkcm"] Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.949699 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kjkcm" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="registry-server" containerID="cri-o://caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6" gracePeriod=30 Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.967400 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mkf6s"] Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.967681 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mkf6s" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="registry-server" containerID="cri-o://9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5" gracePeriod=30 Mar 20 00:13:43 crc kubenswrapper[4867]: I0320 00:13:43.974296 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w9fsp"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.049074 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d521d536-d270-40aa-9c6e-e80b679d1ecd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.049394 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4sqn\" 
(UniqueName: \"kubernetes.io/projected/d521d536-d270-40aa-9c6e-e80b679d1ecd-kube-api-access-p4sqn\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.049429 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d521d536-d270-40aa-9c6e-e80b679d1ecd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.150773 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4sqn\" (UniqueName: \"kubernetes.io/projected/d521d536-d270-40aa-9c6e-e80b679d1ecd-kube-api-access-p4sqn\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.150826 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d521d536-d270-40aa-9c6e-e80b679d1ecd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.150888 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d521d536-d270-40aa-9c6e-e80b679d1ecd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.152010 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d521d536-d270-40aa-9c6e-e80b679d1ecd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.165222 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d521d536-d270-40aa-9c6e-e80b679d1ecd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.165740 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4sqn\" (UniqueName: \"kubernetes.io/projected/d521d536-d270-40aa-9c6e-e80b679d1ecd-kube-api-access-p4sqn\") pod \"marketplace-operator-79b997595-w9fsp\" (UID: \"d521d536-d270-40aa-9c6e-e80b679d1ecd\") " pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.253885 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.377885 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.379612 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.408798 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.414711 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.439492 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.440673 4867 generic.go:334] "Generic (PLEG): container finished" podID="61384875-b5b9-4757-839c-2071e973510c" containerID="8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1" exitCode=0 Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.444259 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerID="caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6" exitCode=0 Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.444371 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkcm" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.454876 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsj5n" event={"ID":"61384875-b5b9-4757-839c-2071e973510c","Type":"ContainerDied","Data":"8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.454938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsj5n" event={"ID":"61384875-b5b9-4757-839c-2071e973510c","Type":"ContainerDied","Data":"170a7ed892dcd4be1cf3dfcd9718cb8365cff5852358ca07c477b59a62324da5"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.454955 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerDied","Data":"caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.454969 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkcm" event={"ID":"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9","Type":"ContainerDied","Data":"da6862ff6d5baf84be583fff63754c8506dc27d7b0583f3ec334ab28c3f096e6"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.454984 4867 scope.go:117] "RemoveContainer" containerID="8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456009 4867 generic.go:334] "Generic (PLEG): container finished" podID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerID="cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd" exitCode=0 Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456096 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9sbs" 
event={"ID":"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337","Type":"ContainerDied","Data":"cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456120 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9sbs" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456134 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9sbs" event={"ID":"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337","Type":"ContainerDied","Data":"70f0593448c8127dfd3c2038c46a9a06d4aa0de28c09ab3dd181d27cc156e6a5"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456581 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-utilities\") pod \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456615 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-operator-metrics\") pod \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456647 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-trusted-ca\") pod \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456664 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg9hh\" (UniqueName: 
\"kubernetes.io/projected/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-kube-api-access-rg9hh\") pod \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456714 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-catalog-content\") pod \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\" (UID: \"58c1b369-ec0e-43d0-a3ca-8c2b7b74d337\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.456747 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzph2\" (UniqueName: \"kubernetes.io/projected/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-kube-api-access-hzph2\") pod \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\" (UID: \"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.457993 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" (UID: "a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.458415 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-utilities" (OuterVolumeSpecName: "utilities") pod "58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" (UID: "58c1b369-ec0e-43d0-a3ca-8c2b7b74d337"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.461672 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-kube-api-access-rg9hh" (OuterVolumeSpecName: "kube-api-access-rg9hh") pod "58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" (UID: "58c1b369-ec0e-43d0-a3ca-8c2b7b74d337"). InnerVolumeSpecName "kube-api-access-rg9hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.461801 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-kube-api-access-hzph2" (OuterVolumeSpecName: "kube-api-access-hzph2") pod "a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" (UID: "a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce"). InnerVolumeSpecName "kube-api-access-hzph2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.461799 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" (UID: "a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.463164 4867 generic.go:334] "Generic (PLEG): container finished" podID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerID="875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04" exitCode=0 Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.463259 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.463715 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" event={"ID":"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce","Type":"ContainerDied","Data":"875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.463742 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fhrs" event={"ID":"a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce","Type":"ContainerDied","Data":"c433bfbe70451694c72c12339b685ec3a8df0ed54078619d99111767fe3144fb"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.471885 4867 scope.go:117] "RemoveContainer" containerID="995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.489399 4867 generic.go:334] "Generic (PLEG): container finished" podID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerID="9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5" exitCode=0 Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.489448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkf6s" event={"ID":"b08fdffb-6c98-4f30-b70b-1592c40e01dc","Type":"ContainerDied","Data":"9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.489477 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkf6s" event={"ID":"b08fdffb-6c98-4f30-b70b-1592c40e01dc","Type":"ContainerDied","Data":"e4ab5e6e53da4a0456e3f75fc9882b818273c81331f7388430274d4a5f5eb8ee"} Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.489560 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mkf6s" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.497090 4867 scope.go:117] "RemoveContainer" containerID="f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.510652 4867 scope.go:117] "RemoveContainer" containerID="8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.511596 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1\": container with ID starting with 8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1 not found: ID does not exist" containerID="8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.511624 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1"} err="failed to get container status \"8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1\": rpc error: code = NotFound desc = could not find container \"8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1\": container with ID starting with 8d8cea14d46a5de6ea7826a31b0f9da5bef51ba9da2ee4c4413a1619271fa6e1 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.511645 4867 scope.go:117] "RemoveContainer" containerID="995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.513551 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d\": container with ID starting with 
995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d not found: ID does not exist" containerID="995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.513576 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d"} err="failed to get container status \"995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d\": rpc error: code = NotFound desc = could not find container \"995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d\": container with ID starting with 995341c81058f227c09a06b8a1ee48b56cbe4c4057c9ef92186c5fce49c9928d not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.513617 4867 scope.go:117] "RemoveContainer" containerID="f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.515073 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9\": container with ID starting with f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9 not found: ID does not exist" containerID="f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.515399 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9"} err="failed to get container status \"f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9\": rpc error: code = NotFound desc = could not find container \"f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9\": container with ID starting with f6dd7d22c930181891391f29c6f10cd35142159a758fdad9d22960b644f952e9 not found: ID does not 
exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.515605 4867 scope.go:117] "RemoveContainer" containerID="caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.517975 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fhrs"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.520648 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fhrs"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.526701 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" (UID: "58c1b369-ec0e-43d0-a3ca-8c2b7b74d337"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.537304 4867 scope.go:117] "RemoveContainer" containerID="d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.557946 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-utilities\") pod \"61384875-b5b9-4757-839c-2071e973510c\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558001 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgw7w\" (UniqueName: \"kubernetes.io/projected/b08fdffb-6c98-4f30-b70b-1592c40e01dc-kube-api-access-mgw7w\") pod \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558075 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-catalog-content\") pod \"61384875-b5b9-4757-839c-2071e973510c\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558116 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gghsf\" (UniqueName: \"kubernetes.io/projected/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-kube-api-access-gghsf\") pod \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558142 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-catalog-content\") pod \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558164 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-catalog-content\") pod \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558188 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5v5h\" (UniqueName: \"kubernetes.io/projected/61384875-b5b9-4757-839c-2071e973510c-kube-api-access-s5v5h\") pod \"61384875-b5b9-4757-839c-2071e973510c\" (UID: \"61384875-b5b9-4757-839c-2071e973510c\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558225 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-utilities\") pod 
\"b08fdffb-6c98-4f30-b70b-1592c40e01dc\" (UID: \"b08fdffb-6c98-4f30-b70b-1592c40e01dc\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558261 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-utilities\") pod \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\" (UID: \"8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9\") " Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558597 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558612 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558622 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg9hh\" (UniqueName: \"kubernetes.io/projected/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-kube-api-access-rg9hh\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558633 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558641 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.558649 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzph2\" (UniqueName: 
\"kubernetes.io/projected/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce-kube-api-access-hzph2\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.559577 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-utilities" (OuterVolumeSpecName: "utilities") pod "61384875-b5b9-4757-839c-2071e973510c" (UID: "61384875-b5b9-4757-839c-2071e973510c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.561699 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-utilities" (OuterVolumeSpecName: "utilities") pod "b08fdffb-6c98-4f30-b70b-1592c40e01dc" (UID: "b08fdffb-6c98-4f30-b70b-1592c40e01dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.563829 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08fdffb-6c98-4f30-b70b-1592c40e01dc-kube-api-access-mgw7w" (OuterVolumeSpecName: "kube-api-access-mgw7w") pod "b08fdffb-6c98-4f30-b70b-1592c40e01dc" (UID: "b08fdffb-6c98-4f30-b70b-1592c40e01dc"). InnerVolumeSpecName "kube-api-access-mgw7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.563851 4867 scope.go:117] "RemoveContainer" containerID="5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.565232 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61384875-b5b9-4757-839c-2071e973510c-kube-api-access-s5v5h" (OuterVolumeSpecName: "kube-api-access-s5v5h") pod "61384875-b5b9-4757-839c-2071e973510c" (UID: "61384875-b5b9-4757-839c-2071e973510c"). 
InnerVolumeSpecName "kube-api-access-s5v5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.565843 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-utilities" (OuterVolumeSpecName: "utilities") pod "8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" (UID: "8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.568701 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-kube-api-access-gghsf" (OuterVolumeSpecName: "kube-api-access-gghsf") pod "8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" (UID: "8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9"). InnerVolumeSpecName "kube-api-access-gghsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.577653 4867 scope.go:117] "RemoveContainer" containerID="caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.578051 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6\": container with ID starting with caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6 not found: ID does not exist" containerID="caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.578079 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6"} err="failed to get container status \"caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6\": rpc error: code = NotFound 
desc = could not find container \"caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6\": container with ID starting with caecc798f336bb322c4ba6033a10318683802d22cf0bc335fda8355bf0e545f6 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.578098 4867 scope.go:117] "RemoveContainer" containerID="d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.578296 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4\": container with ID starting with d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4 not found: ID does not exist" containerID="d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.578320 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4"} err="failed to get container status \"d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4\": rpc error: code = NotFound desc = could not find container \"d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4\": container with ID starting with d13bea056258fe86c712ef6c22999bb0fa33cf8e1c2ba47cd3a1abfd4ba9bea4 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.578339 4867 scope.go:117] "RemoveContainer" containerID="5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.578524 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657\": container with ID starting with 
5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657 not found: ID does not exist" containerID="5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.578543 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657"} err="failed to get container status \"5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657\": rpc error: code = NotFound desc = could not find container \"5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657\": container with ID starting with 5a14cd334ae76bb90f0bb1e887fc5287680954c1e98e974923f70f587ca0f657 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.578556 4867 scope.go:117] "RemoveContainer" containerID="cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.594356 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" (UID: "8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.598667 4867 scope.go:117] "RemoveContainer" containerID="7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.620794 4867 scope.go:117] "RemoveContainer" containerID="3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.622839 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61384875-b5b9-4757-839c-2071e973510c" (UID: "61384875-b5b9-4757-839c-2071e973510c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.656987 4867 scope.go:117] "RemoveContainer" containerID="cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659430 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gghsf\" (UniqueName: \"kubernetes.io/projected/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-kube-api-access-gghsf\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659451 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.659550 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd\": container with ID starting with cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd not found: ID does not exist" 
containerID="cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659577 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd"} err="failed to get container status \"cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd\": rpc error: code = NotFound desc = could not find container \"cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd\": container with ID starting with cde655d824a83bb005be7983641b5234c7f9c75b60511b9babdf303ecbad22cd not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659596 4867 scope.go:117] "RemoveContainer" containerID="7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659644 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5v5h\" (UniqueName: \"kubernetes.io/projected/61384875-b5b9-4757-839c-2071e973510c-kube-api-access-s5v5h\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659655 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659665 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659673 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659682 4867 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgw7w\" (UniqueName: \"kubernetes.io/projected/b08fdffb-6c98-4f30-b70b-1592c40e01dc-kube-api-access-mgw7w\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.659691 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61384875-b5b9-4757-839c-2071e973510c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.660141 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a\": container with ID starting with 7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a not found: ID does not exist" containerID="7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.660202 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a"} err="failed to get container status \"7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a\": rpc error: code = NotFound desc = could not find container \"7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a\": container with ID starting with 7dbf9eddb496d05913c8f26c5ff8de0053f76c8baf9d3e1cac5cb45fce6ea95a not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.660240 4867 scope.go:117] "RemoveContainer" containerID="3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.660537 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a\": container with ID 
starting with 3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a not found: ID does not exist" containerID="3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.660560 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a"} err="failed to get container status \"3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a\": rpc error: code = NotFound desc = could not find container \"3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a\": container with ID starting with 3ce0310b792b7d8046eef2176c5049a1431e9378502a417854886f883f3c178a not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.660574 4867 scope.go:117] "RemoveContainer" containerID="875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.674230 4867 scope.go:117] "RemoveContainer" containerID="a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.697733 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b08fdffb-6c98-4f30-b70b-1592c40e01dc" (UID: "b08fdffb-6c98-4f30-b70b-1592c40e01dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.700479 4867 scope.go:117] "RemoveContainer" containerID="875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.701148 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04\": container with ID starting with 875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04 not found: ID does not exist" containerID="875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.701201 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04"} err="failed to get container status \"875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04\": rpc error: code = NotFound desc = could not find container \"875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04\": container with ID starting with 875cafd11fb27376ca1d6f0fee8da6672aae1607d02d49f4f81b9bf70b0d8b04 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.701222 4867 scope.go:117] "RemoveContainer" containerID="a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.701573 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab\": container with ID starting with a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab not found: ID does not exist" containerID="a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.701600 
4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab"} err="failed to get container status \"a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab\": rpc error: code = NotFound desc = could not find container \"a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab\": container with ID starting with a4c8dad9a7ef27f2a3665157f06c09e88512ae2b08f48f7f52b6eea1574609ab not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.701613 4867 scope.go:117] "RemoveContainer" containerID="9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.717394 4867 scope.go:117] "RemoveContainer" containerID="f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.721124 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w9fsp"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.736338 4867 scope.go:117] "RemoveContainer" containerID="ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.756563 4867 scope.go:117] "RemoveContainer" containerID="9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.757116 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5\": container with ID starting with 9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5 not found: ID does not exist" containerID="9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.757152 4867 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5"} err="failed to get container status \"9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5\": rpc error: code = NotFound desc = could not find container \"9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5\": container with ID starting with 9b98619835a00f8b5a6d242d80093903415f3b604374fee52c2112448554f1c5 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.757178 4867 scope.go:117] "RemoveContainer" containerID="f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.760546 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08fdffb-6c98-4f30-b70b-1592c40e01dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.761451 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f\": container with ID starting with f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f not found: ID does not exist" containerID="f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.761546 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f"} err="failed to get container status \"f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f\": rpc error: code = NotFound desc = could not find container \"f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f\": container with ID starting with f948beae7519ff5114e68af54bf173886c150d0197e87dc6a03f8732c4da109f not found: ID does not 
exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.761576 4867 scope.go:117] "RemoveContainer" containerID="ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8" Mar 20 00:13:44 crc kubenswrapper[4867]: E0320 00:13:44.762706 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8\": container with ID starting with ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8 not found: ID does not exist" containerID="ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.762758 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8"} err="failed to get container status \"ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8\": rpc error: code = NotFound desc = could not find container \"ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8\": container with ID starting with ec1044df2b6ac042c2fff456d04abeb4e2136a155d535fc91210fcca6ef9c5f8 not found: ID does not exist" Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.771102 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkcm"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.776959 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkcm"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.801581 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9sbs"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.807152 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w9sbs"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.844123 
4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mkf6s"] Mar 20 00:13:44 crc kubenswrapper[4867]: I0320 00:13:44.847343 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mkf6s"] Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.290886 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rfsss"] Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291424 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291439 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291452 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291460 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291469 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291478 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291494 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291524 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291541 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291549 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291562 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291570 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291580 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291588 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291598 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291605 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291622 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291630 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291642 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291650 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291664 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291672 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291683 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291691 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291701 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291709 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="extract-utilities" Mar 20 00:13:45 crc kubenswrapper[4867]: E0320 00:13:45.291720 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291728 4867 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="extract-content" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291842 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291859 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291868 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="61384875-b5b9-4757-839c-2071e973510c" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291881 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.291893 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" containerName="registry-server" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.292114 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" containerName="marketplace-operator" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.292791 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.299033 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfsss"] Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.299757 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.367355 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-catalog-content\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.367411 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-utilities\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.367446 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xph\" (UniqueName: \"kubernetes.io/projected/9dd55f5a-599d-4769-8ddc-7983b0042236-kube-api-access-22xph\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.468798 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-utilities\") pod \"redhat-marketplace-rfsss\" (UID: 
\"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.468871 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22xph\" (UniqueName: \"kubernetes.io/projected/9dd55f5a-599d-4769-8ddc-7983b0042236-kube-api-access-22xph\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.468968 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-catalog-content\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.469427 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-utilities\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.469434 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-catalog-content\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.493477 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7xpgd"] Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.494668 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.497277 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.497533 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xph\" (UniqueName: \"kubernetes.io/projected/9dd55f5a-599d-4769-8ddc-7983b0042236-kube-api-access-22xph\") pod \"redhat-marketplace-rfsss\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.500095 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" event={"ID":"d521d536-d270-40aa-9c6e-e80b679d1ecd","Type":"ContainerStarted","Data":"176bbc1a623c1b0e203badf61bd106cf2db9f1955d705d801455c436d5a80728"} Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.500127 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" event={"ID":"d521d536-d270-40aa-9c6e-e80b679d1ecd","Type":"ContainerStarted","Data":"19b171d4d507adff8dfde390036cfde67e294bdbef94a87eef6af05bc57842e2"} Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.503361 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.505692 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsj5n" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.516881 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.520852 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xpgd"] Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.562703 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w9fsp" podStartSLOduration=2.5626813410000002 podStartE2EDuration="2.562681341s" podCreationTimestamp="2026-03-20 00:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:13:45.552385571 +0000 UTC m=+439.778923098" watchObservedRunningTime="2026-03-20 00:13:45.562681341 +0000 UTC m=+439.789218858" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.570423 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7qw8\" (UniqueName: \"kubernetes.io/projected/accffd80-6502-4252-b9c1-5c6901af4739-kube-api-access-v7qw8\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.570562 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/accffd80-6502-4252-b9c1-5c6901af4739-utilities\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.570628 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/accffd80-6502-4252-b9c1-5c6901af4739-catalog-content\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.585562 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fsj5n"] Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.589695 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fsj5n"] Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.615344 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.671455 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7qw8\" (UniqueName: \"kubernetes.io/projected/accffd80-6502-4252-b9c1-5c6901af4739-kube-api-access-v7qw8\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.671529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/accffd80-6502-4252-b9c1-5c6901af4739-utilities\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.671556 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/accffd80-6502-4252-b9c1-5c6901af4739-catalog-content\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " 
pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.671941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/accffd80-6502-4252-b9c1-5c6901af4739-catalog-content\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.671984 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/accffd80-6502-4252-b9c1-5c6901af4739-utilities\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.696034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7qw8\" (UniqueName: \"kubernetes.io/projected/accffd80-6502-4252-b9c1-5c6901af4739-kube-api-access-v7qw8\") pod \"redhat-operators-7xpgd\" (UID: \"accffd80-6502-4252-b9c1-5c6901af4739\") " pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:45 crc kubenswrapper[4867]: I0320 00:13:45.841437 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xpgd" Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.012034 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfsss"] Mar 20 00:13:46 crc kubenswrapper[4867]: W0320 00:13:46.018307 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd55f5a_599d_4769_8ddc_7983b0042236.slice/crio-9f97313a2f7c7ec55d7f58531ab832e0beab90a348db0a5a3d6893dd09828ed9 WatchSource:0}: Error finding container 9f97313a2f7c7ec55d7f58531ab832e0beab90a348db0a5a3d6893dd09828ed9: Status 404 returned error can't find the container with id 9f97313a2f7c7ec55d7f58531ab832e0beab90a348db0a5a3d6893dd09828ed9 Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.232409 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xpgd"] Mar 20 00:13:46 crc kubenswrapper[4867]: W0320 00:13:46.237360 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaccffd80_6502_4252_b9c1_5c6901af4739.slice/crio-cff701f77e0a880fadd5acbc329935d9c3cf5e3e286d96b79eb8e2c330336175 WatchSource:0}: Error finding container cff701f77e0a880fadd5acbc329935d9c3cf5e3e286d96b79eb8e2c330336175: Status 404 returned error can't find the container with id cff701f77e0a880fadd5acbc329935d9c3cf5e3e286d96b79eb8e2c330336175 Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.431629 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c1b369-ec0e-43d0-a3ca-8c2b7b74d337" path="/var/lib/kubelet/pods/58c1b369-ec0e-43d0-a3ca-8c2b7b74d337/volumes" Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.432660 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61384875-b5b9-4757-839c-2071e973510c" path="/var/lib/kubelet/pods/61384875-b5b9-4757-839c-2071e973510c/volumes" Mar 
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.433182 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9" path="/var/lib/kubelet/pods/8b0ef29e-a6e3-486a-b6b0-ad7c2d72a5a9/volumes"
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.434476 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce" path="/var/lib/kubelet/pods/a5fc6d4e-37ff-406f-9de6-cc2a6cba39ce/volumes"
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.435103 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08fdffb-6c98-4f30-b70b-1592c40e01dc" path="/var/lib/kubelet/pods/b08fdffb-6c98-4f30-b70b-1592c40e01dc/volumes"
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.516603 4867 generic.go:334] "Generic (PLEG): container finished" podID="accffd80-6502-4252-b9c1-5c6901af4739" containerID="70733701e12113c269746e19ff588f68e063bd8b79efff7fdef5d1555a8965c4" exitCode=0
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.516704 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xpgd" event={"ID":"accffd80-6502-4252-b9c1-5c6901af4739","Type":"ContainerDied","Data":"70733701e12113c269746e19ff588f68e063bd8b79efff7fdef5d1555a8965c4"}
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.516752 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xpgd" event={"ID":"accffd80-6502-4252-b9c1-5c6901af4739","Type":"ContainerStarted","Data":"cff701f77e0a880fadd5acbc329935d9c3cf5e3e286d96b79eb8e2c330336175"}
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.518597 4867 generic.go:334] "Generic (PLEG): container finished" podID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerID="250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd" exitCode=0
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.518689 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfsss" event={"ID":"9dd55f5a-599d-4769-8ddc-7983b0042236","Type":"ContainerDied","Data":"250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd"}
Mar 20 00:13:46 crc kubenswrapper[4867]: I0320 00:13:46.518723 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfsss" event={"ID":"9dd55f5a-599d-4769-8ddc-7983b0042236","Type":"ContainerStarted","Data":"9f97313a2f7c7ec55d7f58531ab832e0beab90a348db0a5a3d6893dd09828ed9"}
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.091909 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zdhfk"]
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.093038 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.094807 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.107024 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zdhfk"]
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.188136 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ba182-4e03-41a5-91fc-89a5528b1d64-utilities\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.188245 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ba182-4e03-41a5-91fc-89a5528b1d64-catalog-content\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.188288 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hpb\" (UniqueName: \"kubernetes.io/projected/de6ba182-4e03-41a5-91fc-89a5528b1d64-kube-api-access-85hpb\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.289077 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hpb\" (UniqueName: \"kubernetes.io/projected/de6ba182-4e03-41a5-91fc-89a5528b1d64-kube-api-access-85hpb\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.289133 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ba182-4e03-41a5-91fc-89a5528b1d64-utilities\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.289188 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ba182-4e03-41a5-91fc-89a5528b1d64-catalog-content\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.289606 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6ba182-4e03-41a5-91fc-89a5528b1d64-catalog-content\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.289904 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6ba182-4e03-41a5-91fc-89a5528b1d64-utilities\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.307589 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hpb\" (UniqueName: \"kubernetes.io/projected/de6ba182-4e03-41a5-91fc-89a5528b1d64-kube-api-access-85hpb\") pod \"certified-operators-zdhfk\" (UID: \"de6ba182-4e03-41a5-91fc-89a5528b1d64\") " pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.414174 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:47 crc kubenswrapper[4867]: I0320 00:13:47.826978 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zdhfk"]
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.096036 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ng8hp"]
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.097544 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.099445 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.106695 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ng8hp"]
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.203002 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339c89c2-adce-4198-aa16-c34cc8e8176a-catalog-content\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.203127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339c89c2-adce-4198-aa16-c34cc8e8176a-utilities\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.203176 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8c8l\" (UniqueName: \"kubernetes.io/projected/339c89c2-adce-4198-aa16-c34cc8e8176a-kube-api-access-f8c8l\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.304725 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339c89c2-adce-4198-aa16-c34cc8e8176a-utilities\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.304803 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8c8l\" (UniqueName: \"kubernetes.io/projected/339c89c2-adce-4198-aa16-c34cc8e8176a-kube-api-access-f8c8l\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.304895 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339c89c2-adce-4198-aa16-c34cc8e8176a-catalog-content\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.305278 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339c89c2-adce-4198-aa16-c34cc8e8176a-utilities\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.305456 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339c89c2-adce-4198-aa16-c34cc8e8176a-catalog-content\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.326667 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8c8l\" (UniqueName: \"kubernetes.io/projected/339c89c2-adce-4198-aa16-c34cc8e8176a-kube-api-access-f8c8l\") pod \"community-operators-ng8hp\" (UID: \"339c89c2-adce-4198-aa16-c34cc8e8176a\") " pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.420206 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.533444 4867 generic.go:334] "Generic (PLEG): container finished" podID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerID="0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5" exitCode=0
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.533531 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfsss" event={"ID":"9dd55f5a-599d-4769-8ddc-7983b0042236","Type":"ContainerDied","Data":"0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5"}
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.535144 4867 generic.go:334] "Generic (PLEG): container finished" podID="de6ba182-4e03-41a5-91fc-89a5528b1d64" containerID="1cdbe577a7e2392da4189792f3ebfd2ebea22ce0a7c6054c8000fc83c6364274" exitCode=0
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.535188 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdhfk" event={"ID":"de6ba182-4e03-41a5-91fc-89a5528b1d64","Type":"ContainerDied","Data":"1cdbe577a7e2392da4189792f3ebfd2ebea22ce0a7c6054c8000fc83c6364274"}
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.535205 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdhfk" event={"ID":"de6ba182-4e03-41a5-91fc-89a5528b1d64","Type":"ContainerStarted","Data":"30002a97e697ca9f4fda4ef102636dc682a3e72769e920102f5273cda4c5a08e"}
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.570238 4867 generic.go:334] "Generic (PLEG): container finished" podID="accffd80-6502-4252-b9c1-5c6901af4739" containerID="75cda22964f7c503ac6b67518af72bde1af9bb8ae0b7bf88cf52dce3649690a7" exitCode=0
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.570284 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xpgd" event={"ID":"accffd80-6502-4252-b9c1-5c6901af4739","Type":"ContainerDied","Data":"75cda22964f7c503ac6b67518af72bde1af9bb8ae0b7bf88cf52dce3649690a7"}
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.613727 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ng8hp"]
Mar 20 00:13:48 crc kubenswrapper[4867]: W0320 00:13:48.617698 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod339c89c2_adce_4198_aa16_c34cc8e8176a.slice/crio-4b491a847383228907579633cfaa24bb26023a224eeef4477ed33f894b718868 WatchSource:0}: Error finding container 4b491a847383228907579633cfaa24bb26023a224eeef4477ed33f894b718868: Status 404 returned error can't find the container with id 4b491a847383228907579633cfaa24bb26023a224eeef4477ed33f894b718868
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.860323 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:13:48 crc kubenswrapper[4867]: I0320 00:13:48.860375 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.579592 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xpgd" event={"ID":"accffd80-6502-4252-b9c1-5c6901af4739","Type":"ContainerStarted","Data":"53a5c692ef082656b7e50be6599b7481be75e39b19869f196e2de814d900c2e1"}
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.581461 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfsss" event={"ID":"9dd55f5a-599d-4769-8ddc-7983b0042236","Type":"ContainerStarted","Data":"185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e"}
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.583925 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdhfk" event={"ID":"de6ba182-4e03-41a5-91fc-89a5528b1d64","Type":"ContainerStarted","Data":"a3e7a644d9f820e0cf9a3d7c93f8f7b6d7be1450a060ee702a1f289aaf5b9541"}
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.585279 4867 generic.go:334] "Generic (PLEG): container finished" podID="339c89c2-adce-4198-aa16-c34cc8e8176a" containerID="0ca242676bd54a2530cff4bca3ca8ea9043d070bce84269d0623ff52ee60154d" exitCode=0
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.585333 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng8hp" event={"ID":"339c89c2-adce-4198-aa16-c34cc8e8176a","Type":"ContainerDied","Data":"0ca242676bd54a2530cff4bca3ca8ea9043d070bce84269d0623ff52ee60154d"}
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.585368 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng8hp" event={"ID":"339c89c2-adce-4198-aa16-c34cc8e8176a","Type":"ContainerStarted","Data":"4b491a847383228907579633cfaa24bb26023a224eeef4477ed33f894b718868"}
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.608381 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7xpgd" podStartSLOduration=2.172980375 podStartE2EDuration="4.608363057s" podCreationTimestamp="2026-03-20 00:13:45 +0000 UTC" firstStartedPulling="2026-03-20 00:13:46.518276079 +0000 UTC m=+440.744813596" lastFinishedPulling="2026-03-20 00:13:48.953658761 +0000 UTC m=+443.180196278" observedRunningTime="2026-03-20 00:13:49.600851605 +0000 UTC m=+443.827389132" watchObservedRunningTime="2026-03-20 00:13:49.608363057 +0000 UTC m=+443.834900574"
Mar 20 00:13:49 crc kubenswrapper[4867]: I0320 00:13:49.614879 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rfsss" podStartSLOduration=2.152883759 podStartE2EDuration="4.61486427s" podCreationTimestamp="2026-03-20 00:13:45 +0000 UTC" firstStartedPulling="2026-03-20 00:13:46.521302725 +0000 UTC m=+440.747840252" lastFinishedPulling="2026-03-20 00:13:48.983283246 +0000 UTC m=+443.209820763" observedRunningTime="2026-03-20 00:13:49.613628545 +0000 UTC m=+443.840166062" watchObservedRunningTime="2026-03-20 00:13:49.61486427 +0000 UTC m=+443.841401787"
Mar 20 00:13:50 crc kubenswrapper[4867]: I0320 00:13:50.591479 4867 generic.go:334] "Generic (PLEG): container finished" podID="de6ba182-4e03-41a5-91fc-89a5528b1d64" containerID="a3e7a644d9f820e0cf9a3d7c93f8f7b6d7be1450a060ee702a1f289aaf5b9541" exitCode=0
Mar 20 00:13:50 crc kubenswrapper[4867]: I0320 00:13:50.591523 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdhfk" event={"ID":"de6ba182-4e03-41a5-91fc-89a5528b1d64","Type":"ContainerDied","Data":"a3e7a644d9f820e0cf9a3d7c93f8f7b6d7be1450a060ee702a1f289aaf5b9541"}
Mar 20 00:13:50 crc kubenswrapper[4867]: I0320 00:13:50.593209 4867 generic.go:334] "Generic (PLEG): container finished" podID="339c89c2-adce-4198-aa16-c34cc8e8176a" containerID="19c3be11b64049665e1d9ab317abeb20cc36f5511cfc59f2d51a8a184c8d13c4" exitCode=0
Mar 20 00:13:50 crc kubenswrapper[4867]: I0320 00:13:50.593232 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng8hp" event={"ID":"339c89c2-adce-4198-aa16-c34cc8e8176a","Type":"ContainerDied","Data":"19c3be11b64049665e1d9ab317abeb20cc36f5511cfc59f2d51a8a184c8d13c4"}
Mar 20 00:13:51 crc kubenswrapper[4867]: I0320 00:13:51.601077 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng8hp" event={"ID":"339c89c2-adce-4198-aa16-c34cc8e8176a","Type":"ContainerStarted","Data":"1bb35d138fd9b4646bcf9b03ee4d89fef90a8bf9276b942415885c287d18afec"}
Mar 20 00:13:51 crc kubenswrapper[4867]: I0320 00:13:51.604448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zdhfk" event={"ID":"de6ba182-4e03-41a5-91fc-89a5528b1d64","Type":"ContainerStarted","Data":"b0dc4fb580ccc4191c5bec1887a486c9bf66d87a9c94addc03e00c86a179d559"}
Mar 20 00:13:51 crc kubenswrapper[4867]: I0320 00:13:51.623744 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ng8hp" podStartSLOduration=1.910625317 podStartE2EDuration="3.623728528s" podCreationTimestamp="2026-03-20 00:13:48 +0000 UTC" firstStartedPulling="2026-03-20 00:13:49.586652855 +0000 UTC m=+443.813190372" lastFinishedPulling="2026-03-20 00:13:51.299756046 +0000 UTC m=+445.526293583" observedRunningTime="2026-03-20 00:13:51.622849854 +0000 UTC m=+445.849387381" watchObservedRunningTime="2026-03-20 00:13:51.623728528 +0000 UTC m=+445.850266055"
Mar 20 00:13:51 crc kubenswrapper[4867]: I0320 00:13:51.647484 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zdhfk" podStartSLOduration=2.008222939 podStartE2EDuration="4.647468388s" podCreationTimestamp="2026-03-20 00:13:47 +0000 UTC" firstStartedPulling="2026-03-20 00:13:48.53680781 +0000 UTC m=+442.763345327" lastFinishedPulling="2026-03-20 00:13:51.176053259 +0000 UTC m=+445.402590776" observedRunningTime="2026-03-20 00:13:51.64648494 +0000 UTC m=+445.873022457" watchObservedRunningTime="2026-03-20 00:13:51.647468388 +0000 UTC m=+445.874005905"
Mar 20 00:13:55 crc kubenswrapper[4867]: I0320 00:13:55.616439 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rfsss"
Mar 20 00:13:55 crc kubenswrapper[4867]: I0320 00:13:55.616857 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rfsss"
Mar 20 00:13:55 crc kubenswrapper[4867]: I0320 00:13:55.676416 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rfsss"
Mar 20 00:13:55 crc kubenswrapper[4867]: I0320 00:13:55.732923 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rfsss"
Mar 20 00:13:55 crc kubenswrapper[4867]: I0320 00:13:55.861415 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7xpgd"
Mar 20 00:13:55 crc kubenswrapper[4867]: I0320 00:13:55.861483 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7xpgd"
Mar 20 00:13:56 crc kubenswrapper[4867]: I0320 00:13:56.908655 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7xpgd" podUID="accffd80-6502-4252-b9c1-5c6901af4739" containerName="registry-server" probeResult="failure" output=<
Mar 20 00:13:56 crc kubenswrapper[4867]: 	timeout: failed to connect service ":50051" within 1s
Mar 20 00:13:56 crc kubenswrapper[4867]: >
Mar 20 00:13:57 crc kubenswrapper[4867]: I0320 00:13:57.393991 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-c8vpm"
Mar 20 00:13:57 crc kubenswrapper[4867]: I0320 00:13:57.415147 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:57 crc kubenswrapper[4867]: I0320 00:13:57.415778 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:57 crc kubenswrapper[4867]: I0320 00:13:57.483191 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7gtj4"]
Mar 20 00:13:57 crc kubenswrapper[4867]: I0320 00:13:57.514750 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:57 crc kubenswrapper[4867]: I0320 00:13:57.703317 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zdhfk"
Mar 20 00:13:58 crc kubenswrapper[4867]: I0320 00:13:58.420987 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:58 crc kubenswrapper[4867]: I0320 00:13:58.429416 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:58 crc kubenswrapper[4867]: I0320 00:13:58.470854 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:13:58 crc kubenswrapper[4867]: I0320 00:13:58.685211 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ng8hp"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.143348 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566094-dfkrv"]
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.144253 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.145913 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.146274 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.146316 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.184775 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566094-dfkrv"]
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.272794 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7qqw\" (UniqueName: \"kubernetes.io/projected/2d8cf962-f825-4ef0-98d1-8648fc43f979-kube-api-access-c7qqw\") pod \"auto-csr-approver-29566094-dfkrv\" (UID: \"2d8cf962-f825-4ef0-98d1-8648fc43f979\") " pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.374539 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7qqw\" (UniqueName: \"kubernetes.io/projected/2d8cf962-f825-4ef0-98d1-8648fc43f979-kube-api-access-c7qqw\") pod \"auto-csr-approver-29566094-dfkrv\" (UID: \"2d8cf962-f825-4ef0-98d1-8648fc43f979\") " pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.394743 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7qqw\" (UniqueName: \"kubernetes.io/projected/2d8cf962-f825-4ef0-98d1-8648fc43f979-kube-api-access-c7qqw\") pod \"auto-csr-approver-29566094-dfkrv\" (UID: \"2d8cf962-f825-4ef0-98d1-8648fc43f979\") " pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.464645 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:00 crc kubenswrapper[4867]: I0320 00:14:00.951846 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566094-dfkrv"]
Mar 20 00:14:01 crc kubenswrapper[4867]: I0320 00:14:01.656362 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566094-dfkrv" event={"ID":"2d8cf962-f825-4ef0-98d1-8648fc43f979","Type":"ContainerStarted","Data":"51fadb1dbb210e6df449607252f139c5001740944facca01078e2e45cae2ad67"}
Mar 20 00:14:03 crc kubenswrapper[4867]: I0320 00:14:03.667748 4867 generic.go:334] "Generic (PLEG): container finished" podID="2d8cf962-f825-4ef0-98d1-8648fc43f979" containerID="d2c383227d0bb7ea2a5692407530a2a7e7be5abf41ca9501791a8dd92e9681e4" exitCode=0
Mar 20 00:14:03 crc kubenswrapper[4867]: I0320 00:14:03.668054 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566094-dfkrv" event={"ID":"2d8cf962-f825-4ef0-98d1-8648fc43f979","Type":"ContainerDied","Data":"d2c383227d0bb7ea2a5692407530a2a7e7be5abf41ca9501791a8dd92e9681e4"}
Mar 20 00:14:04 crc kubenswrapper[4867]: I0320 00:14:04.998834 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.165119 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7qqw\" (UniqueName: \"kubernetes.io/projected/2d8cf962-f825-4ef0-98d1-8648fc43f979-kube-api-access-c7qqw\") pod \"2d8cf962-f825-4ef0-98d1-8648fc43f979\" (UID: \"2d8cf962-f825-4ef0-98d1-8648fc43f979\") "
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.171674 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8cf962-f825-4ef0-98d1-8648fc43f979-kube-api-access-c7qqw" (OuterVolumeSpecName: "kube-api-access-c7qqw") pod "2d8cf962-f825-4ef0-98d1-8648fc43f979" (UID: "2d8cf962-f825-4ef0-98d1-8648fc43f979"). InnerVolumeSpecName "kube-api-access-c7qqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.266538 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7qqw\" (UniqueName: \"kubernetes.io/projected/2d8cf962-f825-4ef0-98d1-8648fc43f979-kube-api-access-c7qqw\") on node \"crc\" DevicePath \"\""
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.684200 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566094-dfkrv" event={"ID":"2d8cf962-f825-4ef0-98d1-8648fc43f979","Type":"ContainerDied","Data":"51fadb1dbb210e6df449607252f139c5001740944facca01078e2e45cae2ad67"}
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.684238 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566094-dfkrv"
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.684260 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51fadb1dbb210e6df449607252f139c5001740944facca01078e2e45cae2ad67"
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.893177 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7xpgd"
Mar 20 00:14:05 crc kubenswrapper[4867]: I0320 00:14:05.967207 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7xpgd"
Mar 20 00:14:06 crc kubenswrapper[4867]: I0320 00:14:06.091180 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566088-pbdgz"]
Mar 20 00:14:06 crc kubenswrapper[4867]: I0320 00:14:06.097643 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566088-pbdgz"]
Mar 20 00:14:06 crc kubenswrapper[4867]: I0320 00:14:06.435055 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91db560-3692-4697-bbf0-3c5a8438d5e5" path="/var/lib/kubelet/pods/e91db560-3692-4697-bbf0-3c5a8438d5e5/volumes"
Mar 20 00:14:18 crc kubenswrapper[4867]: I0320 00:14:18.860736 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:14:18 crc kubenswrapper[4867]: I0320 00:14:18.861347 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.542066 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" podUID="5df8ee0a-f645-4f19-b75c-9cd29e21be30" containerName="registry" containerID="cri-o://eb14b8164ea25462478959496409d0f650d0a4a7ffad1ac9e937014f607c00a2" gracePeriod=30
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.784749 4867 generic.go:334] "Generic (PLEG): container finished" podID="5df8ee0a-f645-4f19-b75c-9cd29e21be30" containerID="eb14b8164ea25462478959496409d0f650d0a4a7ffad1ac9e937014f607c00a2" exitCode=0
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.784826 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" event={"ID":"5df8ee0a-f645-4f19-b75c-9cd29e21be30","Type":"ContainerDied","Data":"eb14b8164ea25462478959496409d0f650d0a4a7ffad1ac9e937014f607c00a2"}
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.884301 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4"
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.992792 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfxxf\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-kube-api-access-nfxxf\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.992841 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-certificates\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.992866 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-bound-sa-token\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.992890 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df8ee0a-f645-4f19-b75c-9cd29e21be30-installation-pull-secrets\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.992919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df8ee0a-f645-4f19-b75c-9cd29e21be30-ca-trust-extracted\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.992939 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-trusted-ca\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.993129 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.993194 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-tls\") pod \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\" (UID: \"5df8ee0a-f645-4f19-b75c-9cd29e21be30\") "
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.994471 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.994840 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.998166 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.998203 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-kube-api-access-nfxxf" (OuterVolumeSpecName: "kube-api-access-nfxxf") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "kube-api-access-nfxxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:14:22 crc kubenswrapper[4867]: I0320 00:14:22.998562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df8ee0a-f645-4f19-b75c-9cd29e21be30-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.000570 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.004175 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.009891 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df8ee0a-f645-4f19-b75c-9cd29e21be30-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5df8ee0a-f645-4f19-b75c-9cd29e21be30" (UID: "5df8ee0a-f645-4f19-b75c-9cd29e21be30"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094150 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfxxf\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-kube-api-access-nfxxf\") on node \"crc\" DevicePath \"\""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094184 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094195 4867 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094205 4867 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName:
\"kubernetes.io/secret/5df8ee0a-f645-4f19-b75c-9cd29e21be30-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094215 4867 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df8ee0a-f645-4f19-b75c-9cd29e21be30-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094225 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df8ee0a-f645-4f19-b75c-9cd29e21be30-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.094234 4867 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df8ee0a-f645-4f19-b75c-9cd29e21be30-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.792529 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" event={"ID":"5df8ee0a-f645-4f19-b75c-9cd29e21be30","Type":"ContainerDied","Data":"7086d5ecefe4bbd71dfe58605d6960d1b88f04bd1e4e6ccfda82168a862e4ccd"} Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.792595 4867 scope.go:117] "RemoveContainer" containerID="eb14b8164ea25462478959496409d0f650d0a4a7ffad1ac9e937014f607c00a2" Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.792675 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7gtj4" Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.829637 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7gtj4"] Mar 20 00:14:23 crc kubenswrapper[4867]: I0320 00:14:23.833882 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7gtj4"] Mar 20 00:14:24 crc kubenswrapper[4867]: I0320 00:14:24.429795 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df8ee0a-f645-4f19-b75c-9cd29e21be30" path="/var/lib/kubelet/pods/5df8ee0a-f645-4f19-b75c-9cd29e21be30/volumes" Mar 20 00:14:48 crc kubenswrapper[4867]: I0320 00:14:48.860203 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:14:48 crc kubenswrapper[4867]: I0320 00:14:48.860908 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:14:48 crc kubenswrapper[4867]: I0320 00:14:48.860977 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:14:48 crc kubenswrapper[4867]: I0320 00:14:48.861801 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e816a18b900f4eb68c97f4f5041f2cb93edb33be685b6595c99943bb25a6e738"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 00:14:48 crc kubenswrapper[4867]: I0320 00:14:48.861898 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://e816a18b900f4eb68c97f4f5041f2cb93edb33be685b6595c99943bb25a6e738" gracePeriod=600 Mar 20 00:14:49 crc kubenswrapper[4867]: I0320 00:14:49.966484 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="e816a18b900f4eb68c97f4f5041f2cb93edb33be685b6595c99943bb25a6e738" exitCode=0 Mar 20 00:14:49 crc kubenswrapper[4867]: I0320 00:14:49.966572 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"e816a18b900f4eb68c97f4f5041f2cb93edb33be685b6595c99943bb25a6e738"} Mar 20 00:14:49 crc kubenswrapper[4867]: I0320 00:14:49.967082 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"4f460505b5fefef8711ce12da7e390bf20ce1928a1083fdde393f4aa4fca83e6"} Mar 20 00:14:49 crc kubenswrapper[4867]: I0320 00:14:49.967164 4867 scope.go:117] "RemoveContainer" containerID="d3d5c1dc31c5f8f0b51872b3b597f119729c198402ee491ca1f2d66a2bb0caee" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.144923 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7"] Mar 20 00:15:00 crc kubenswrapper[4867]: E0320 00:15:00.145820 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df8ee0a-f645-4f19-b75c-9cd29e21be30" containerName="registry" Mar 20 00:15:00 
crc kubenswrapper[4867]: I0320 00:15:00.145836 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df8ee0a-f645-4f19-b75c-9cd29e21be30" containerName="registry" Mar 20 00:15:00 crc kubenswrapper[4867]: E0320 00:15:00.145857 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8cf962-f825-4ef0-98d1-8648fc43f979" containerName="oc" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.145865 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8cf962-f825-4ef0-98d1-8648fc43f979" containerName="oc" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.145980 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8cf962-f825-4ef0-98d1-8648fc43f979" containerName="oc" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.145993 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df8ee0a-f645-4f19-b75c-9cd29e21be30" containerName="registry" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.146417 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.149859 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.150105 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.161051 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7"] Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.275553 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tc69\" (UniqueName: \"kubernetes.io/projected/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-kube-api-access-4tc69\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.276157 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-config-volume\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.276304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-secret-volume\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.377984 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-config-volume\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.378073 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-secret-volume\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.378218 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tc69\" (UniqueName: \"kubernetes.io/projected/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-kube-api-access-4tc69\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.380645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-config-volume\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.389416 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-secret-volume\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.407731 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tc69\" (UniqueName: \"kubernetes.io/projected/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-kube-api-access-4tc69\") pod \"collect-profiles-29566095-cqpf7\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.475647 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:00 crc kubenswrapper[4867]: I0320 00:15:00.741415 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7"] Mar 20 00:15:01 crc kubenswrapper[4867]: I0320 00:15:01.044688 4867 generic.go:334] "Generic (PLEG): container finished" podID="ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" containerID="f2d7b94f3f0b72088d60b372f15b9fde56ea1ef38aaf8b6990194375f70bcb22" exitCode=0 Mar 20 00:15:01 crc kubenswrapper[4867]: I0320 00:15:01.044794 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" event={"ID":"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc","Type":"ContainerDied","Data":"f2d7b94f3f0b72088d60b372f15b9fde56ea1ef38aaf8b6990194375f70bcb22"} Mar 20 00:15:01 crc kubenswrapper[4867]: I0320 00:15:01.045079 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" 
event={"ID":"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc","Type":"ContainerStarted","Data":"a9904b6269aba05d47cf8a2056325747b4b04638a08b9106d681b24019e36bbf"} Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.322274 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.505229 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-secret-volume\") pod \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.505338 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tc69\" (UniqueName: \"kubernetes.io/projected/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-kube-api-access-4tc69\") pod \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.505482 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-config-volume\") pod \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\" (UID: \"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc\") " Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.506730 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" (UID: "ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.513592 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" (UID: "ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.513800 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-kube-api-access-4tc69" (OuterVolumeSpecName: "kube-api-access-4tc69") pod "ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" (UID: "ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc"). InnerVolumeSpecName "kube-api-access-4tc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.607830 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.607909 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 00:15:02 crc kubenswrapper[4867]: I0320 00:15:02.607936 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tc69\" (UniqueName: \"kubernetes.io/projected/ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc-kube-api-access-4tc69\") on node \"crc\" DevicePath \"\"" Mar 20 00:15:03 crc kubenswrapper[4867]: I0320 00:15:03.061567 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" 
event={"ID":"ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc","Type":"ContainerDied","Data":"a9904b6269aba05d47cf8a2056325747b4b04638a08b9106d681b24019e36bbf"} Mar 20 00:15:03 crc kubenswrapper[4867]: I0320 00:15:03.061639 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9904b6269aba05d47cf8a2056325747b4b04638a08b9106d681b24019e36bbf" Mar 20 00:15:03 crc kubenswrapper[4867]: I0320 00:15:03.061736 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566095-cqpf7" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.130287 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566096-l8bkt"] Mar 20 00:16:00 crc kubenswrapper[4867]: E0320 00:16:00.130980 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" containerName="collect-profiles" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.130991 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" containerName="collect-profiles" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.131086 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2040b5-cf77-46d7-ab5a-7ba2461f2dfc" containerName="collect-profiles" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.131446 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.133774 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.133924 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.134296 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.139272 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566096-l8bkt"] Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.307714 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbzk\" (UniqueName: \"kubernetes.io/projected/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2-kube-api-access-zgbzk\") pod \"auto-csr-approver-29566096-l8bkt\" (UID: \"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2\") " pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.409450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbzk\" (UniqueName: \"kubernetes.io/projected/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2-kube-api-access-zgbzk\") pod \"auto-csr-approver-29566096-l8bkt\" (UID: \"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2\") " pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.433105 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbzk\" (UniqueName: \"kubernetes.io/projected/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2-kube-api-access-zgbzk\") pod \"auto-csr-approver-29566096-l8bkt\" (UID: \"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2\") " 
pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.447334 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.641365 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566096-l8bkt"] Mar 20 00:16:00 crc kubenswrapper[4867]: I0320 00:16:00.647653 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 00:16:01 crc kubenswrapper[4867]: I0320 00:16:01.432351 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" event={"ID":"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2","Type":"ContainerStarted","Data":"d71ea641112bd99a09c1c02cb927c437fcdf7b32032232970fc2ca572c5cfb30"} Mar 20 00:16:02 crc kubenswrapper[4867]: I0320 00:16:02.439480 4867 generic.go:334] "Generic (PLEG): container finished" podID="e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2" containerID="87550ab71a28fecc4f6573b5137c2dbe64d336c6dae82aa97c920357e2634015" exitCode=0 Mar 20 00:16:02 crc kubenswrapper[4867]: I0320 00:16:02.439568 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" event={"ID":"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2","Type":"ContainerDied","Data":"87550ab71a28fecc4f6573b5137c2dbe64d336c6dae82aa97c920357e2634015"} Mar 20 00:16:03 crc kubenswrapper[4867]: I0320 00:16:03.684919 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:03 crc kubenswrapper[4867]: I0320 00:16:03.853074 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgbzk\" (UniqueName: \"kubernetes.io/projected/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2-kube-api-access-zgbzk\") pod \"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2\" (UID: \"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2\") " Mar 20 00:16:03 crc kubenswrapper[4867]: I0320 00:16:03.861781 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2-kube-api-access-zgbzk" (OuterVolumeSpecName: "kube-api-access-zgbzk") pod "e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2" (UID: "e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2"). InnerVolumeSpecName "kube-api-access-zgbzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:16:03 crc kubenswrapper[4867]: I0320 00:16:03.955532 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgbzk\" (UniqueName: \"kubernetes.io/projected/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2-kube-api-access-zgbzk\") on node \"crc\" DevicePath \"\"" Mar 20 00:16:04 crc kubenswrapper[4867]: I0320 00:16:04.459163 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" Mar 20 00:16:04 crc kubenswrapper[4867]: I0320 00:16:04.459077 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566096-l8bkt" event={"ID":"e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2","Type":"ContainerDied","Data":"d71ea641112bd99a09c1c02cb927c437fcdf7b32032232970fc2ca572c5cfb30"} Mar 20 00:16:04 crc kubenswrapper[4867]: I0320 00:16:04.459917 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71ea641112bd99a09c1c02cb927c437fcdf7b32032232970fc2ca572c5cfb30" Mar 20 00:16:04 crc kubenswrapper[4867]: I0320 00:16:04.744854 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566090-fczg6"] Mar 20 00:16:04 crc kubenswrapper[4867]: I0320 00:16:04.749248 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566090-fczg6"] Mar 20 00:16:06 crc kubenswrapper[4867]: I0320 00:16:06.432119 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b1008c-a714-49a2-8c17-e971afc302af" path="/var/lib/kubelet/pods/a4b1008c-a714-49a2-8c17-e971afc302af/volumes" Mar 20 00:17:18 crc kubenswrapper[4867]: I0320 00:17:18.860555 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:17:18 crc kubenswrapper[4867]: I0320 00:17:18.861347 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:17:46 crc 
kubenswrapper[4867]: I0320 00:17:46.856587 4867 scope.go:117] "RemoveContainer" containerID="a7e3db854172fcfe7fd52c5129cfbbb027b3bb941031a8dd48e9af48f157292e"
Mar 20 00:17:46 crc kubenswrapper[4867]: I0320 00:17:46.899295 4867 scope.go:117] "RemoveContainer" containerID="ded3131ae9568b714d2611623f59bf8d400b6e7d047b4867c92aa7440f7e7425"
Mar 20 00:17:48 crc kubenswrapper[4867]: I0320 00:17:48.859729 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:17:48 crc kubenswrapper[4867]: I0320 00:17:48.860083 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.142826 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566098-66gdx"]
Mar 20 00:18:00 crc kubenswrapper[4867]: E0320 00:18:00.143588 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2" containerName="oc"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.143600 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2" containerName="oc"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.143718 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2" containerName="oc"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.144186 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.147235 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.147430 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.147797 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.154588 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566098-66gdx"]
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.223568 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqkk\" (UniqueName: \"kubernetes.io/projected/26e3cfe9-a819-4d53-a513-376208ec2884-kube-api-access-5qqkk\") pod \"auto-csr-approver-29566098-66gdx\" (UID: \"26e3cfe9-a819-4d53-a513-376208ec2884\") " pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.325663 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qqkk\" (UniqueName: \"kubernetes.io/projected/26e3cfe9-a819-4d53-a513-376208ec2884-kube-api-access-5qqkk\") pod \"auto-csr-approver-29566098-66gdx\" (UID: \"26e3cfe9-a819-4d53-a513-376208ec2884\") " pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.352313 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqkk\" (UniqueName: \"kubernetes.io/projected/26e3cfe9-a819-4d53-a513-376208ec2884-kube-api-access-5qqkk\") pod \"auto-csr-approver-29566098-66gdx\" (UID: \"26e3cfe9-a819-4d53-a513-376208ec2884\") " pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.467857 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:00 crc kubenswrapper[4867]: I0320 00:18:00.699893 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566098-66gdx"]
Mar 20 00:18:01 crc kubenswrapper[4867]: I0320 00:18:01.442165 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566098-66gdx" event={"ID":"26e3cfe9-a819-4d53-a513-376208ec2884","Type":"ContainerStarted","Data":"e0d67226c93e8565d1fe6e7b2116310baea41f5b2a3bb849322085d03d53bc9e"}
Mar 20 00:18:02 crc kubenswrapper[4867]: I0320 00:18:02.449117 4867 generic.go:334] "Generic (PLEG): container finished" podID="26e3cfe9-a819-4d53-a513-376208ec2884" containerID="ee398b267881cece77c96a5a97ca5b7c75da4388c86f20e9b04c8d5b29520d5e" exitCode=0
Mar 20 00:18:02 crc kubenswrapper[4867]: I0320 00:18:02.449193 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566098-66gdx" event={"ID":"26e3cfe9-a819-4d53-a513-376208ec2884","Type":"ContainerDied","Data":"ee398b267881cece77c96a5a97ca5b7c75da4388c86f20e9b04c8d5b29520d5e"}
Mar 20 00:18:03 crc kubenswrapper[4867]: I0320 00:18:03.829305 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:03 crc kubenswrapper[4867]: I0320 00:18:03.877315 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qqkk\" (UniqueName: \"kubernetes.io/projected/26e3cfe9-a819-4d53-a513-376208ec2884-kube-api-access-5qqkk\") pod \"26e3cfe9-a819-4d53-a513-376208ec2884\" (UID: \"26e3cfe9-a819-4d53-a513-376208ec2884\") "
Mar 20 00:18:03 crc kubenswrapper[4867]: I0320 00:18:03.885704 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e3cfe9-a819-4d53-a513-376208ec2884-kube-api-access-5qqkk" (OuterVolumeSpecName: "kube-api-access-5qqkk") pod "26e3cfe9-a819-4d53-a513-376208ec2884" (UID: "26e3cfe9-a819-4d53-a513-376208ec2884"). InnerVolumeSpecName "kube-api-access-5qqkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:18:03 crc kubenswrapper[4867]: I0320 00:18:03.978959 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qqkk\" (UniqueName: \"kubernetes.io/projected/26e3cfe9-a819-4d53-a513-376208ec2884-kube-api-access-5qqkk\") on node \"crc\" DevicePath \"\""
Mar 20 00:18:04 crc kubenswrapper[4867]: I0320 00:18:04.464983 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566098-66gdx"
Mar 20 00:18:04 crc kubenswrapper[4867]: I0320 00:18:04.465096 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566098-66gdx" event={"ID":"26e3cfe9-a819-4d53-a513-376208ec2884","Type":"ContainerDied","Data":"e0d67226c93e8565d1fe6e7b2116310baea41f5b2a3bb849322085d03d53bc9e"}
Mar 20 00:18:04 crc kubenswrapper[4867]: I0320 00:18:04.465155 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0d67226c93e8565d1fe6e7b2116310baea41f5b2a3bb849322085d03d53bc9e"
Mar 20 00:18:04 crc kubenswrapper[4867]: I0320 00:18:04.903526 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566092-d4zmt"]
Mar 20 00:18:04 crc kubenswrapper[4867]: I0320 00:18:04.910606 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566092-d4zmt"]
Mar 20 00:18:06 crc kubenswrapper[4867]: I0320 00:18:06.431276 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442e78ca-a2a5-4281-9fe0-e42e0d4e1df2" path="/var/lib/kubelet/pods/442e78ca-a2a5-4281-9fe0-e42e0d4e1df2/volumes"
Mar 20 00:18:18 crc kubenswrapper[4867]: I0320 00:18:18.860245 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:18:18 crc kubenswrapper[4867]: I0320 00:18:18.860855 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:18:18 crc kubenswrapper[4867]: I0320 00:18:18.860906 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm"
Mar 20 00:18:18 crc kubenswrapper[4867]: I0320 00:18:18.861545 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f460505b5fefef8711ce12da7e390bf20ce1928a1083fdde393f4aa4fca83e6"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 00:18:18 crc kubenswrapper[4867]: I0320 00:18:18.861605 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://4f460505b5fefef8711ce12da7e390bf20ce1928a1083fdde393f4aa4fca83e6" gracePeriod=600
Mar 20 00:18:19 crc kubenswrapper[4867]: I0320 00:18:19.568564 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="4f460505b5fefef8711ce12da7e390bf20ce1928a1083fdde393f4aa4fca83e6" exitCode=0
Mar 20 00:18:19 crc kubenswrapper[4867]: I0320 00:18:19.568643 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"4f460505b5fefef8711ce12da7e390bf20ce1928a1083fdde393f4aa4fca83e6"}
Mar 20 00:18:19 crc kubenswrapper[4867]: I0320 00:18:19.569005 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"33d423cd234b4249f2b7bca2c14b518d37de21e54afaaa7c2a2edca5bfac00fe"}
Mar 20 00:18:19 crc kubenswrapper[4867]: I0320 00:18:19.569042 4867 scope.go:117] "RemoveContainer" containerID="e816a18b900f4eb68c97f4f5041f2cb93edb33be685b6595c99943bb25a6e738"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.557094 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5zkft"]
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.558024 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-controller" containerID="cri-o://497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.558133 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-node" containerID="cri-o://23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.558175 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-acl-logging" containerID="cri-o://079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.558145 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="northd" containerID="cri-o://6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.558124 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.560725 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="sbdb" containerID="cri-o://208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.560900 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="nbdb" containerID="cri-o://a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.600531 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller" containerID="cri-o://68269af659165297c90238e7a18f8776fb268e39ddd21dbbae7f391687bb05ee" gracePeriod=30
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.739219 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98n2n_97e52c03-2ca5-4cad-8459-f03029234544/kube-multus/1.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.739744 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98n2n_97e52c03-2ca5-4cad-8459-f03029234544/kube-multus/0.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.739784 4867 generic.go:334] "Generic (PLEG): container finished" podID="97e52c03-2ca5-4cad-8459-f03029234544" containerID="94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f" exitCode=2
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.739819 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98n2n" event={"ID":"97e52c03-2ca5-4cad-8459-f03029234544","Type":"ContainerDied","Data":"94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f"}
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.739872 4867 scope.go:117] "RemoveContainer" containerID="b18fe6248e8cbffbf324d331afdc321ae6119dc1403c72ec83b6b960d6a7f9ab"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.740343 4867 scope.go:117] "RemoveContainer" containerID="94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.740560 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-98n2n_openshift-multus(97e52c03-2ca5-4cad-8459-f03029234544)\"" pod="openshift-multus/multus-98n2n" podUID="97e52c03-2ca5-4cad-8459-f03029234544"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.745275 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/3.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.747545 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovn-acl-logging/0.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748019 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovn-controller/0.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748444 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="68269af659165297c90238e7a18f8776fb268e39ddd21dbbae7f391687bb05ee" exitCode=0
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748466 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a" exitCode=0
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748475 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e" exitCode=0
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748482 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd" exitCode=143
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748506 4867 generic.go:334] "Generic (PLEG): container finished" podID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerID="497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae" exitCode=143
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748531 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"68269af659165297c90238e7a18f8776fb268e39ddd21dbbae7f391687bb05ee"}
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748561 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a"}
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748574 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e"}
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748586 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd"}
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.748595 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae"}
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.835206 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovnkube-controller/3.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.837007 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovn-acl-logging/0.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.837543 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5zkft_f1af2033-700e-4f63-939d-b7132a1e5b5f/ovn-controller/0.log"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.837978 4867 scope.go:117] "RemoveContainer" containerID="f732ddfa6a7d3675db4bf3c6edebbe853f5e8970e3ac5e6b7dafa97cbb4f3f94"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.838116 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885566 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zglz9"]
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885754 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-acl-logging"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885764 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-acl-logging"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885775 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-node"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885780 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-node"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885789 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="sbdb"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885795 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="sbdb"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885802 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885809 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885817 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885823 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885831 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e3cfe9-a819-4d53-a513-376208ec2884" containerName="oc"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885836 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e3cfe9-a819-4d53-a513-376208ec2884" containerName="oc"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885845 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885851 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885857 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kubecfg-setup"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885864 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kubecfg-setup"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885870 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885876 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885883 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885889 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885897 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="northd"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885902 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="northd"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885908 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="nbdb"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885913 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="nbdb"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.885923 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.885928 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886012 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886024 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="kube-rbac-proxy-node"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886030 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="sbdb"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886038 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886044 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886051 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886058 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="nbdb"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886064 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e3cfe9-a819-4d53-a513-376208ec2884" containerName="oc"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886072 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-acl-logging"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886077 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="northd"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886083 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovn-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886090 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: E0320 00:18:46.886181 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886187 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.886270 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" containerName="ovnkube-controller"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.887875 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.959690 4867 scope.go:117] "RemoveContainer" containerID="497131109cd1db23c801f067d72e2b9ed25bae5b9676dc549854dae61030b3ae"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.981900 4867 scope.go:117] "RemoveContainer" containerID="208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88"
Mar 20 00:18:46 crc kubenswrapper[4867]: I0320 00:18:46.996419 4867 scope.go:117] "RemoveContainer" containerID="079008ee146c97f5a83c4594b06ac8ec8725aa63c17b0d1f0d9d303e63a700cd"
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.010250 4867 scope.go:117] "RemoveContainer" containerID="23378a69f27315be09110ee7c8522358c0cfff32c91c73b11c66f5659fb4366e"
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.021734 4867 scope.go:117] "RemoveContainer" containerID="d6ec8864937d04a1757bdacd9ca350fdde70b7431168b255ea3664369b55f6f1"
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.037953 4867 scope.go:117] "RemoveContainer" containerID="a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8"
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.038707 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-systemd\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.038964 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-kubelet\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.039001 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.039401 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-node-log\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.039787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-netns\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.039526 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-node-log" (OuterVolumeSpecName: "node-log") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.039975 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.040112 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.040323 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-netd\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.040555 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-systemd-units\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.040741 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovn-node-metrics-cert\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.040942 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-config\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041148 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-bin\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.040591 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041217 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041346 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-ovn\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041484 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041532 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68jdx\" (UniqueName: \"kubernetes.io/projected/f1af2033-700e-4f63-939d-b7132a1e5b5f-kube-api-access-68jdx\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041554 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041569 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-etc-openvswitch\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041597 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-openvswitch\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041619 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-log-socket\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041639 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-var-lib-openvswitch\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041667 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-env-overrides\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") "
Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041704 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\"
(UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-script-lib\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041726 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-ovn-kubernetes\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041749 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-slash\") pod \"f1af2033-700e-4f63-939d-b7132a1e5b5f\" (UID: \"f1af2033-700e-4f63-939d-b7132a1e5b5f\") " Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041875 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041914 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-var-lib-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041942 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041972 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-env-overrides\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042004 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-log-socket\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042035 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-systemd\") pod \"ovnkube-node-zglz9\" (UID: 
\"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042054 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-cni-bin\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042102 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-ovn\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042143 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042166 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-slash\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042186 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-etc-openvswitch\") pod 
\"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042206 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsnq\" (UniqueName: \"kubernetes.io/projected/aac79318-0b0e-4b5e-b825-f11a53187b2f-kube-api-access-2bsnq\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042226 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-cni-netd\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042255 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovnkube-script-lib\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042290 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovn-node-metrics-cert\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042326 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-run-netns\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041904 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041929 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041950 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.041967 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-log-socket" (OuterVolumeSpecName: "log-socket") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042326 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042351 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-slash" (OuterVolumeSpecName: "host-slash") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042345 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042408 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-kubelet\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-node-log\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042763 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovnkube-config\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042795 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-systemd-units\") pod 
\"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042845 4867 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042858 4867 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042868 4867 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042862 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042879 4867 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042935 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042963 4867 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.042991 4867 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043016 4867 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043040 4867 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043065 4867 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043090 
4867 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043114 4867 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043139 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043162 4867 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.043185 4867 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.044533 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.045006 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1af2033-700e-4f63-939d-b7132a1e5b5f-kube-api-access-68jdx" (OuterVolumeSpecName: "kube-api-access-68jdx") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "kube-api-access-68jdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.045584 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.051591 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f1af2033-700e-4f63-939d-b7132a1e5b5f" (UID: "f1af2033-700e-4f63-939d-b7132a1e5b5f"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.053653 4867 scope.go:117] "RemoveContainer" containerID="68269af659165297c90238e7a18f8776fb268e39ddd21dbbae7f391687bb05ee" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.068481 4867 scope.go:117] "RemoveContainer" containerID="6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.083331 4867 scope.go:117] "RemoveContainer" containerID="3d318d14b3cb827999cd59d74599c2a5f2d19eea6a988de0a71f0871886f319a" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.096786 4867 scope.go:117] "RemoveContainer" containerID="d552fe3e6566bf7c6af0ee19811dd49817e096c6c67ce0595253e97c5186b3e6" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144459 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-slash\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144510 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144532 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsnq\" (UniqueName: \"kubernetes.io/projected/aac79318-0b0e-4b5e-b825-f11a53187b2f-kube-api-access-2bsnq\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 
00:18:47.144552 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-etc-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144571 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-cni-netd\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144588 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovnkube-script-lib\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144610 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-run-netns\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovn-node-metrics-cert\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144648 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-kubelet\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144686 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-node-log\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144691 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-run-netns\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144706 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovnkube-config\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144722 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-systemd-units\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144740 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-var-lib-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144740 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-cni-netd\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144794 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-systemd-units\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144801 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144827 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-var-lib-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144755 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144809 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144854 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-env-overrides\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-log-socket\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144876 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-node-log\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144896 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-systemd\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144861 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-kubelet\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144912 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-cni-bin\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144947 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-log-socket\") pod \"ovnkube-node-zglz9\" (UID: 
\"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144984 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-systemd\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.144997 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-cni-bin\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145041 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-ovn\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-run-ovn\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145171 4867 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145188 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68jdx\" (UniqueName: 
\"kubernetes.io/projected/f1af2033-700e-4f63-939d-b7132a1e5b5f-kube-api-access-68jdx\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145206 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145221 4867 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1af2033-700e-4f63-939d-b7132a1e5b5f-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145235 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1af2033-700e-4f63-939d-b7132a1e5b5f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145333 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovnkube-script-lib\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.145351 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovnkube-config\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.146137 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aac79318-0b0e-4b5e-b825-f11a53187b2f-env-overrides\") pod \"ovnkube-node-zglz9\" (UID: 
\"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.146667 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-host-slash\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.146712 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aac79318-0b0e-4b5e-b825-f11a53187b2f-etc-openvswitch\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.150986 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aac79318-0b0e-4b5e-b825-f11a53187b2f-ovn-node-metrics-cert\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.174127 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsnq\" (UniqueName: \"kubernetes.io/projected/aac79318-0b0e-4b5e-b825-f11a53187b2f-kube-api-access-2bsnq\") pod \"ovnkube-node-zglz9\" (UID: \"aac79318-0b0e-4b5e-b825-f11a53187b2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.201222 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:47 crc kubenswrapper[4867]: W0320 00:18:47.225469 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaac79318_0b0e_4b5e_b825_f11a53187b2f.slice/crio-d78beb91359e6acdf56af43196f28e97f5ef1fda20a4142ac96621010ea120cc WatchSource:0}: Error finding container d78beb91359e6acdf56af43196f28e97f5ef1fda20a4142ac96621010ea120cc: Status 404 returned error can't find the container with id d78beb91359e6acdf56af43196f28e97f5ef1fda20a4142ac96621010ea120cc Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.757516 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"208681f3aa6273281ca679a75a44374c1b278262cbe4daf4bef0034b5bbabc88"} Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.757560 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"a3a7b0c56ff1f6a2f63a08297a6c5f8a2f0ab668df66e7e10d4cb9240f640fa8"} Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.757574 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"6d7ffc12e8bf2c2ffba653dca7de6a2f24f2c764528e7a7d0b1d847f0f499c66"} Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.757590 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" event={"ID":"f1af2033-700e-4f63-939d-b7132a1e5b5f","Type":"ContainerDied","Data":"a649631439ad5c106e63692b492367a8607628e5e3ef5401365566b3dc2961a5"} Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.760868 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-98n2n_97e52c03-2ca5-4cad-8459-f03029234544/kube-multus/1.log" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.762603 4867 generic.go:334] "Generic (PLEG): container finished" podID="aac79318-0b0e-4b5e-b825-f11a53187b2f" containerID="c3938e0a56c1d870ddf91ee378a6527aacc6345fbf9dd0e5c4734d3c1dc6bf49" exitCode=0 Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.762686 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerDied","Data":"c3938e0a56c1d870ddf91ee378a6527aacc6345fbf9dd0e5c4734d3c1dc6bf49"} Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.762725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"d78beb91359e6acdf56af43196f28e97f5ef1fda20a4142ac96621010ea120cc"} Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.762803 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5zkft" Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.816621 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5zkft"] Mar 20 00:18:47 crc kubenswrapper[4867]: I0320 00:18:47.825808 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5zkft"] Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 00:18:48.430269 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1af2033-700e-4f63-939d-b7132a1e5b5f" path="/var/lib/kubelet/pods/f1af2033-700e-4f63-939d-b7132a1e5b5f/volumes" Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 00:18:48.772544 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"fdebe28fb53007984d71afdb24b6dfd0d5e7eb4f743156ae2fe350c3c4a70ce4"} Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 00:18:48.772847 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"c5f7eea7e43356cec11240516a8cb4e80f6ea269956f799ee0c55f1ae9b7f9f7"} Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 00:18:48.772862 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"505c93030769b3a697a8abda13e263997c2b91ffab8490671dcc790799a7ac66"} Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 00:18:48.772875 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"3cd4f9a2361cf92543dce53a8e64d7db93d15c6deea6194c67fec414beaa36bf"} Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 
00:18:48.772885 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"43af78017e3ca5d2240e5df48d75810ac4511995540fbba5d92e558dc60800bc"} Mar 20 00:18:48 crc kubenswrapper[4867]: I0320 00:18:48.772897 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"856b7528725fbbff124ac26079f4e9013cbb5246c498c536ef10c97d2c406b7a"} Mar 20 00:18:50 crc kubenswrapper[4867]: I0320 00:18:50.790286 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"a3b9e6fb2be424626d6d67074b2b6c5648c2fa8380f525b482aac956d48f3e3f"} Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.812271 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" event={"ID":"aac79318-0b0e-4b5e-b825-f11a53187b2f","Type":"ContainerStarted","Data":"7b2eca3da63d9b0d7b2b9c17e2d8958c2bd46779ee895380522526f2e8617a4a"} Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.812902 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.812929 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.812948 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.857669 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" podStartSLOduration=7.857652796 
podStartE2EDuration="7.857652796s" podCreationTimestamp="2026-03-20 00:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:18:53.848061017 +0000 UTC m=+748.074598544" watchObservedRunningTime="2026-03-20 00:18:53.857652796 +0000 UTC m=+748.084190303" Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.863482 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:18:53 crc kubenswrapper[4867]: I0320 00:18:53.864556 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:19:00 crc kubenswrapper[4867]: I0320 00:19:00.421016 4867 scope.go:117] "RemoveContainer" containerID="94ac4d4ff5d940e0e8e7fb098074879aa2d41ab5b5b0e51d81d50801b1ff8b3f" Mar 20 00:19:00 crc kubenswrapper[4867]: I0320 00:19:00.860121 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-98n2n_97e52c03-2ca5-4cad-8459-f03029234544/kube-multus/1.log" Mar 20 00:19:00 crc kubenswrapper[4867]: I0320 00:19:00.860541 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-98n2n" event={"ID":"97e52c03-2ca5-4cad-8459-f03029234544","Type":"ContainerStarted","Data":"23b64216e7dee3e43658c0b5ebd1ce6256932cf6d943f873d2fb4424b74d3d42"} Mar 20 00:19:17 crc kubenswrapper[4867]: I0320 00:19:17.237140 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zglz9" Mar 20 00:19:51 crc kubenswrapper[4867]: I0320 00:19:51.836599 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfsss"] Mar 20 00:19:51 crc kubenswrapper[4867]: I0320 00:19:51.837787 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rfsss" 
podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="registry-server" containerID="cri-o://185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e" gracePeriod=30 Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.217096 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.232331 4867 generic.go:334] "Generic (PLEG): container finished" podID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerID="185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e" exitCode=0 Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.232420 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfsss" event={"ID":"9dd55f5a-599d-4769-8ddc-7983b0042236","Type":"ContainerDied","Data":"185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e"} Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.232518 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfsss" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.232532 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfsss" event={"ID":"9dd55f5a-599d-4769-8ddc-7983b0042236","Type":"ContainerDied","Data":"9f97313a2f7c7ec55d7f58531ab832e0beab90a348db0a5a3d6893dd09828ed9"} Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.232564 4867 scope.go:117] "RemoveContainer" containerID="185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.257068 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-utilities\") pod \"9dd55f5a-599d-4769-8ddc-7983b0042236\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.257317 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22xph\" (UniqueName: \"kubernetes.io/projected/9dd55f5a-599d-4769-8ddc-7983b0042236-kube-api-access-22xph\") pod \"9dd55f5a-599d-4769-8ddc-7983b0042236\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.257453 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-catalog-content\") pod \"9dd55f5a-599d-4769-8ddc-7983b0042236\" (UID: \"9dd55f5a-599d-4769-8ddc-7983b0042236\") " Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.260975 4867 scope.go:117] "RemoveContainer" containerID="0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.261683 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-utilities" (OuterVolumeSpecName: "utilities") pod "9dd55f5a-599d-4769-8ddc-7983b0042236" (UID: "9dd55f5a-599d-4769-8ddc-7983b0042236"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.267170 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd55f5a-599d-4769-8ddc-7983b0042236-kube-api-access-22xph" (OuterVolumeSpecName: "kube-api-access-22xph") pod "9dd55f5a-599d-4769-8ddc-7983b0042236" (UID: "9dd55f5a-599d-4769-8ddc-7983b0042236"). InnerVolumeSpecName "kube-api-access-22xph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.294091 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dd55f5a-599d-4769-8ddc-7983b0042236" (UID: "9dd55f5a-599d-4769-8ddc-7983b0042236"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.313075 4867 scope.go:117] "RemoveContainer" containerID="250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.341982 4867 scope.go:117] "RemoveContainer" containerID="185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e" Mar 20 00:19:52 crc kubenswrapper[4867]: E0320 00:19:52.342535 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e\": container with ID starting with 185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e not found: ID does not exist" containerID="185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.342583 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e"} err="failed to get container status \"185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e\": rpc error: code = NotFound desc = could not find container \"185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e\": container with ID starting with 185c5ef02b6db043e3cde22599d4b402515167e14277f36115e5ebd2ef3e5c0e not found: ID does not exist" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.342614 4867 scope.go:117] "RemoveContainer" containerID="0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5" Mar 20 00:19:52 crc kubenswrapper[4867]: E0320 00:19:52.343023 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5\": container with ID starting with 
0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5 not found: ID does not exist" containerID="0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.343061 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5"} err="failed to get container status \"0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5\": rpc error: code = NotFound desc = could not find container \"0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5\": container with ID starting with 0d12d6f72c0b279e1494d8d388678f5f82a265a053ae067ce5bfad5bdaf4baa5 not found: ID does not exist" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.343085 4867 scope.go:117] "RemoveContainer" containerID="250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd" Mar 20 00:19:52 crc kubenswrapper[4867]: E0320 00:19:52.343386 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd\": container with ID starting with 250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd not found: ID does not exist" containerID="250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.343417 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd"} err="failed to get container status \"250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd\": rpc error: code = NotFound desc = could not find container \"250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd\": container with ID starting with 250d5d1db3fd67e4316334db57b6dff2699b3e4ccb51cbd567ae6d03e498b8cd not found: ID does not 
exist" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.359606 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22xph\" (UniqueName: \"kubernetes.io/projected/9dd55f5a-599d-4769-8ddc-7983b0042236-kube-api-access-22xph\") on node \"crc\" DevicePath \"\"" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.359634 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.359647 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd55f5a-599d-4769-8ddc-7983b0042236-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.553658 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfsss"] Mar 20 00:19:52 crc kubenswrapper[4867]: I0320 00:19:52.561414 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfsss"] Mar 20 00:19:54 crc kubenswrapper[4867]: I0320 00:19:54.430570 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" path="/var/lib/kubelet/pods/9dd55f5a-599d-4769-8ddc-7983b0042236/volumes" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.606859 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf"] Mar 20 00:19:55 crc kubenswrapper[4867]: E0320 00:19:55.607547 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="extract-content" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.607569 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" 
containerName="extract-content" Mar 20 00:19:55 crc kubenswrapper[4867]: E0320 00:19:55.607595 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="extract-utilities" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.607609 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="extract-utilities" Mar 20 00:19:55 crc kubenswrapper[4867]: E0320 00:19:55.607631 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="registry-server" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.607644 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="registry-server" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.607814 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd55f5a-599d-4769-8ddc-7983b0042236" containerName="registry-server" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.609014 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.613777 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.623580 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf"] Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.698722 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.698793 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.698842 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tzj\" (UniqueName: \"kubernetes.io/projected/4438858b-60d5-4652-842c-c007bad8b04f-kube-api-access-v7tzj\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: 
I0320 00:19:55.800073 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tzj\" (UniqueName: \"kubernetes.io/projected/4438858b-60d5-4652-842c-c007bad8b04f-kube-api-access-v7tzj\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.800476 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.800613 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.801135 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.801133 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.823253 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tzj\" (UniqueName: \"kubernetes.io/projected/4438858b-60d5-4652-842c-c007bad8b04f-kube-api-access-v7tzj\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:55 crc kubenswrapper[4867]: I0320 00:19:55.931294 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:19:56 crc kubenswrapper[4867]: I0320 00:19:56.174404 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf"] Mar 20 00:19:56 crc kubenswrapper[4867]: I0320 00:19:56.257218 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" event={"ID":"4438858b-60d5-4652-842c-c007bad8b04f","Type":"ContainerStarted","Data":"64fe5d3ad40d1df6810339aac7ef8abf403e987d32bf9078a8256dc74fe9ce45"} Mar 20 00:19:57 crc kubenswrapper[4867]: I0320 00:19:57.267629 4867 generic.go:334] "Generic (PLEG): container finished" podID="4438858b-60d5-4652-842c-c007bad8b04f" containerID="0c9a052c2d9a4a2bb635055625b9c779c15d2220220236b6138fde8bebae8028" exitCode=0 Mar 20 00:19:57 crc kubenswrapper[4867]: I0320 00:19:57.267697 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" event={"ID":"4438858b-60d5-4652-842c-c007bad8b04f","Type":"ContainerDied","Data":"0c9a052c2d9a4a2bb635055625b9c779c15d2220220236b6138fde8bebae8028"} Mar 20 00:19:59 crc kubenswrapper[4867]: I0320 00:19:59.285851 4867 generic.go:334] "Generic (PLEG): container finished" podID="4438858b-60d5-4652-842c-c007bad8b04f" containerID="e1bada9f558294b227328079f3bfe5577275c309484da30d0323e3fc6e1a3cd7" exitCode=0 Mar 20 00:19:59 crc kubenswrapper[4867]: I0320 00:19:59.285914 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" event={"ID":"4438858b-60d5-4652-842c-c007bad8b04f","Type":"ContainerDied","Data":"e1bada9f558294b227328079f3bfe5577275c309484da30d0323e3fc6e1a3cd7"} Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.147355 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566100-76td9"] Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.149099 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.152387 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.152851 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.154111 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.165462 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht4p5\" (UniqueName: \"kubernetes.io/projected/981c2ac7-a232-4a5c-a491-face5b8e5ad2-kube-api-access-ht4p5\") pod \"auto-csr-approver-29566100-76td9\" (UID: \"981c2ac7-a232-4a5c-a491-face5b8e5ad2\") " pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.168689 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566100-76td9"] Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.266841 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht4p5\" (UniqueName: \"kubernetes.io/projected/981c2ac7-a232-4a5c-a491-face5b8e5ad2-kube-api-access-ht4p5\") pod \"auto-csr-approver-29566100-76td9\" (UID: \"981c2ac7-a232-4a5c-a491-face5b8e5ad2\") " pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.295615 4867 generic.go:334] "Generic (PLEG): container finished" podID="4438858b-60d5-4652-842c-c007bad8b04f" containerID="0c3260f317b1457bb7fc686157057390f99251fd0e9bb3631dfa9d6afbcd9b0f" exitCode=0 Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.295676 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" event={"ID":"4438858b-60d5-4652-842c-c007bad8b04f","Type":"ContainerDied","Data":"0c3260f317b1457bb7fc686157057390f99251fd0e9bb3631dfa9d6afbcd9b0f"} Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.301097 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht4p5\" (UniqueName: \"kubernetes.io/projected/981c2ac7-a232-4a5c-a491-face5b8e5ad2-kube-api-access-ht4p5\") pod \"auto-csr-approver-29566100-76td9\" (UID: \"981c2ac7-a232-4a5c-a491-face5b8e5ad2\") " pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.471316 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:00 crc kubenswrapper[4867]: I0320 00:20:00.725955 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566100-76td9"] Mar 20 00:20:00 crc kubenswrapper[4867]: W0320 00:20:00.730602 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod981c2ac7_a232_4a5c_a491_face5b8e5ad2.slice/crio-0385f420fbe7f0165d8e34669ea142eb065fe50775b220e7b8f21476d051ffd2 WatchSource:0}: Error finding container 0385f420fbe7f0165d8e34669ea142eb065fe50775b220e7b8f21476d051ffd2: Status 404 returned error can't find the container with id 0385f420fbe7f0165d8e34669ea142eb065fe50775b220e7b8f21476d051ffd2 Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.303672 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566100-76td9" event={"ID":"981c2ac7-a232-4a5c-a491-face5b8e5ad2","Type":"ContainerStarted","Data":"0385f420fbe7f0165d8e34669ea142eb065fe50775b220e7b8f21476d051ffd2"} Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.548434 4867 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.581672 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-util\") pod \"4438858b-60d5-4652-842c-c007bad8b04f\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.581843 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-bundle\") pod \"4438858b-60d5-4652-842c-c007bad8b04f\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.582100 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7tzj\" (UniqueName: \"kubernetes.io/projected/4438858b-60d5-4652-842c-c007bad8b04f-kube-api-access-v7tzj\") pod \"4438858b-60d5-4652-842c-c007bad8b04f\" (UID: \"4438858b-60d5-4652-842c-c007bad8b04f\") " Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.587719 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-bundle" (OuterVolumeSpecName: "bundle") pod "4438858b-60d5-4652-842c-c007bad8b04f" (UID: "4438858b-60d5-4652-842c-c007bad8b04f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.589272 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4438858b-60d5-4652-842c-c007bad8b04f-kube-api-access-v7tzj" (OuterVolumeSpecName: "kube-api-access-v7tzj") pod "4438858b-60d5-4652-842c-c007bad8b04f" (UID: "4438858b-60d5-4652-842c-c007bad8b04f"). 
InnerVolumeSpecName "kube-api-access-v7tzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.611142 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj"] Mar 20 00:20:01 crc kubenswrapper[4867]: E0320 00:20:01.611398 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="pull" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.611411 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="pull" Mar 20 00:20:01 crc kubenswrapper[4867]: E0320 00:20:01.611426 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="util" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.611434 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="util" Mar 20 00:20:01 crc kubenswrapper[4867]: E0320 00:20:01.611446 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="extract" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.611455 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="extract" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.611589 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4438858b-60d5-4652-842c-c007bad8b04f" containerName="extract" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.612436 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.623086 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj"] Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.683557 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrdpc\" (UniqueName: \"kubernetes.io/projected/918d65ef-006f-4f3a-8736-6af79633c18b-kube-api-access-lrdpc\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.683613 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.683694 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.683791 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.683803 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7tzj\" (UniqueName: \"kubernetes.io/projected/4438858b-60d5-4652-842c-c007bad8b04f-kube-api-access-v7tzj\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.768772 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-util" (OuterVolumeSpecName: "util") pod "4438858b-60d5-4652-842c-c007bad8b04f" (UID: "4438858b-60d5-4652-842c-c007bad8b04f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.784311 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrdpc\" (UniqueName: \"kubernetes.io/projected/918d65ef-006f-4f3a-8736-6af79633c18b-kube-api-access-lrdpc\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.784364 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.784396 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: 
\"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.784452 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4438858b-60d5-4652-842c-c007bad8b04f-util\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.784826 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.785900 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.804256 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrdpc\" (UniqueName: \"kubernetes.io/projected/918d65ef-006f-4f3a-8736-6af79633c18b-kube-api-access-lrdpc\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:01 crc kubenswrapper[4867]: I0320 00:20:01.932173 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.232190 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj"] Mar 20 00:20:02 crc kubenswrapper[4867]: W0320 00:20:02.238032 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918d65ef_006f_4f3a_8736_6af79633c18b.slice/crio-4d0064421a6e6fadaed2a370e577d0b620e2dc73269a875942384a7d471e2050 WatchSource:0}: Error finding container 4d0064421a6e6fadaed2a370e577d0b620e2dc73269a875942384a7d471e2050: Status 404 returned error can't find the container with id 4d0064421a6e6fadaed2a370e577d0b620e2dc73269a875942384a7d471e2050 Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.311708 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" event={"ID":"918d65ef-006f-4f3a-8736-6af79633c18b","Type":"ContainerStarted","Data":"4d0064421a6e6fadaed2a370e577d0b620e2dc73269a875942384a7d471e2050"} Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.315422 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" event={"ID":"4438858b-60d5-4652-842c-c007bad8b04f","Type":"ContainerDied","Data":"64fe5d3ad40d1df6810339aac7ef8abf403e987d32bf9078a8256dc74fe9ce45"} Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.315442 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64fe5d3ad40d1df6810339aac7ef8abf403e987d32bf9078a8256dc74fe9ce45" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.315575 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.408447 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw"] Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.414900 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.422713 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.422785 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbr5x\" (UniqueName: \"kubernetes.io/projected/be85d851-fa19-4a82-b64c-80a9a261b53b-kube-api-access-qbr5x\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.422869 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.437874 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw"] Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.524167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbr5x\" (UniqueName: \"kubernetes.io/projected/be85d851-fa19-4a82-b64c-80a9a261b53b-kube-api-access-qbr5x\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.524232 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.524315 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.525256 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" 
(UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.525278 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.548013 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbr5x\" (UniqueName: \"kubernetes.io/projected/be85d851-fa19-4a82-b64c-80a9a261b53b-kube-api-access-qbr5x\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.748715 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:02 crc kubenswrapper[4867]: I0320 00:20:02.970356 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw"] Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.323054 4867 generic.go:334] "Generic (PLEG): container finished" podID="918d65ef-006f-4f3a-8736-6af79633c18b" containerID="166348f953b296a0dc8fe38c51ef597932ab57c46152f7c008518131dfe82e77" exitCode=0 Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.323120 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" event={"ID":"918d65ef-006f-4f3a-8736-6af79633c18b","Type":"ContainerDied","Data":"166348f953b296a0dc8fe38c51ef597932ab57c46152f7c008518131dfe82e77"} Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.326309 4867 generic.go:334] "Generic (PLEG): container finished" podID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerID="99827cf810ff56d305139df1543a012c1c65eb9ac65d6b906f0055836ca2f8ab" exitCode=0 Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.326415 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" event={"ID":"be85d851-fa19-4a82-b64c-80a9a261b53b","Type":"ContainerDied","Data":"99827cf810ff56d305139df1543a012c1c65eb9ac65d6b906f0055836ca2f8ab"} Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.326543 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" event={"ID":"be85d851-fa19-4a82-b64c-80a9a261b53b","Type":"ContainerStarted","Data":"8c5175a4caf0084ed5b71dec086b47d9a9ae3a4e0e483b2143380e2b4f8d7b5c"} Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.328652 4867 
generic.go:334] "Generic (PLEG): container finished" podID="981c2ac7-a232-4a5c-a491-face5b8e5ad2" containerID="cdec02430ca48a3d022c151b34fc5df1175068efe41f849e8db5e70b3c070632" exitCode=0 Mar 20 00:20:03 crc kubenswrapper[4867]: I0320 00:20:03.328696 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566100-76td9" event={"ID":"981c2ac7-a232-4a5c-a491-face5b8e5ad2","Type":"ContainerDied","Data":"cdec02430ca48a3d022c151b34fc5df1175068efe41f849e8db5e70b3c070632"} Mar 20 00:20:04 crc kubenswrapper[4867]: I0320 00:20:04.638892 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:04 crc kubenswrapper[4867]: I0320 00:20:04.755593 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht4p5\" (UniqueName: \"kubernetes.io/projected/981c2ac7-a232-4a5c-a491-face5b8e5ad2-kube-api-access-ht4p5\") pod \"981c2ac7-a232-4a5c-a491-face5b8e5ad2\" (UID: \"981c2ac7-a232-4a5c-a491-face5b8e5ad2\") " Mar 20 00:20:04 crc kubenswrapper[4867]: I0320 00:20:04.763065 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981c2ac7-a232-4a5c-a491-face5b8e5ad2-kube-api-access-ht4p5" (OuterVolumeSpecName: "kube-api-access-ht4p5") pod "981c2ac7-a232-4a5c-a491-face5b8e5ad2" (UID: "981c2ac7-a232-4a5c-a491-face5b8e5ad2"). InnerVolumeSpecName "kube-api-access-ht4p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:20:04 crc kubenswrapper[4867]: I0320 00:20:04.856841 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht4p5\" (UniqueName: \"kubernetes.io/projected/981c2ac7-a232-4a5c-a491-face5b8e5ad2-kube-api-access-ht4p5\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.351674 4867 generic.go:334] "Generic (PLEG): container finished" podID="918d65ef-006f-4f3a-8736-6af79633c18b" containerID="a75659b6cf077dd37756e0856d7eb543367b85a69f67186de22f99170415c495" exitCode=0 Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.352089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" event={"ID":"918d65ef-006f-4f3a-8736-6af79633c18b","Type":"ContainerDied","Data":"a75659b6cf077dd37756e0856d7eb543367b85a69f67186de22f99170415c495"} Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.358825 4867 generic.go:334] "Generic (PLEG): container finished" podID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerID="4d015441c6471d5d9fc026523095cf2cd955621a27e123e0968eda59d26d44ec" exitCode=0 Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.358965 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" event={"ID":"be85d851-fa19-4a82-b64c-80a9a261b53b","Type":"ContainerDied","Data":"4d015441c6471d5d9fc026523095cf2cd955621a27e123e0968eda59d26d44ec"} Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.362688 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566100-76td9" event={"ID":"981c2ac7-a232-4a5c-a491-face5b8e5ad2","Type":"ContainerDied","Data":"0385f420fbe7f0165d8e34669ea142eb065fe50775b220e7b8f21476d051ffd2"} Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.362722 4867 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="0385f420fbe7f0165d8e34669ea142eb065fe50775b220e7b8f21476d051ffd2" Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.362769 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566100-76td9" Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.700311 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566094-dfkrv"] Mar 20 00:20:05 crc kubenswrapper[4867]: I0320 00:20:05.705662 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566094-dfkrv"] Mar 20 00:20:06 crc kubenswrapper[4867]: I0320 00:20:06.370222 4867 generic.go:334] "Generic (PLEG): container finished" podID="918d65ef-006f-4f3a-8736-6af79633c18b" containerID="40d9ce21aa4a7d8da1d9bcdcd57d0f807326de0bdecde0e5ed2481cb794003e0" exitCode=0 Mar 20 00:20:06 crc kubenswrapper[4867]: I0320 00:20:06.370286 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" event={"ID":"918d65ef-006f-4f3a-8736-6af79633c18b","Type":"ContainerDied","Data":"40d9ce21aa4a7d8da1d9bcdcd57d0f807326de0bdecde0e5ed2481cb794003e0"} Mar 20 00:20:06 crc kubenswrapper[4867]: I0320 00:20:06.373289 4867 generic.go:334] "Generic (PLEG): container finished" podID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerID="c8d09e790bc417e6bb9e4c3256bf3c3d318c4357f71484f018f19ae548f5fb51" exitCode=0 Mar 20 00:20:06 crc kubenswrapper[4867]: I0320 00:20:06.373314 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" event={"ID":"be85d851-fa19-4a82-b64c-80a9a261b53b","Type":"ContainerDied","Data":"c8d09e790bc417e6bb9e4c3256bf3c3d318c4357f71484f018f19ae548f5fb51"} Mar 20 00:20:06 crc kubenswrapper[4867]: I0320 00:20:06.426960 4867 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="2d8cf962-f825-4ef0-98d1-8648fc43f979" path="/var/lib/kubelet/pods/2d8cf962-f825-4ef0-98d1-8648fc43f979/volumes" Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.801232 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.806367 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.993263 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-util\") pod \"918d65ef-006f-4f3a-8736-6af79633c18b\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.993412 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-bundle\") pod \"be85d851-fa19-4a82-b64c-80a9a261b53b\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.993500 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrdpc\" (UniqueName: \"kubernetes.io/projected/918d65ef-006f-4f3a-8736-6af79633c18b-kube-api-access-lrdpc\") pod \"918d65ef-006f-4f3a-8736-6af79633c18b\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.993563 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbr5x\" (UniqueName: \"kubernetes.io/projected/be85d851-fa19-4a82-b64c-80a9a261b53b-kube-api-access-qbr5x\") pod \"be85d851-fa19-4a82-b64c-80a9a261b53b\" (UID: 
\"be85d851-fa19-4a82-b64c-80a9a261b53b\") " Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.993599 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-bundle\") pod \"918d65ef-006f-4f3a-8736-6af79633c18b\" (UID: \"918d65ef-006f-4f3a-8736-6af79633c18b\") " Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.993624 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-util\") pod \"be85d851-fa19-4a82-b64c-80a9a261b53b\" (UID: \"be85d851-fa19-4a82-b64c-80a9a261b53b\") " Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.994278 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-bundle" (OuterVolumeSpecName: "bundle") pod "918d65ef-006f-4f3a-8736-6af79633c18b" (UID: "918d65ef-006f-4f3a-8736-6af79633c18b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.994432 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-bundle" (OuterVolumeSpecName: "bundle") pod "be85d851-fa19-4a82-b64c-80a9a261b53b" (UID: "be85d851-fa19-4a82-b64c-80a9a261b53b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.994924 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:07 crc kubenswrapper[4867]: I0320 00:20:07.994938 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.006430 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-util" (OuterVolumeSpecName: "util") pod "918d65ef-006f-4f3a-8736-6af79633c18b" (UID: "918d65ef-006f-4f3a-8736-6af79633c18b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.006901 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be85d851-fa19-4a82-b64c-80a9a261b53b-kube-api-access-qbr5x" (OuterVolumeSpecName: "kube-api-access-qbr5x") pod "be85d851-fa19-4a82-b64c-80a9a261b53b" (UID: "be85d851-fa19-4a82-b64c-80a9a261b53b"). InnerVolumeSpecName "kube-api-access-qbr5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.007424 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/918d65ef-006f-4f3a-8736-6af79633c18b-kube-api-access-lrdpc" (OuterVolumeSpecName: "kube-api-access-lrdpc") pod "918d65ef-006f-4f3a-8736-6af79633c18b" (UID: "918d65ef-006f-4f3a-8736-6af79633c18b"). InnerVolumeSpecName "kube-api-access-lrdpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.025449 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-util" (OuterVolumeSpecName: "util") pod "be85d851-fa19-4a82-b64c-80a9a261b53b" (UID: "be85d851-fa19-4a82-b64c-80a9a261b53b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.096226 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbr5x\" (UniqueName: \"kubernetes.io/projected/be85d851-fa19-4a82-b64c-80a9a261b53b-kube-api-access-qbr5x\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.096280 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be85d851-fa19-4a82-b64c-80a9a261b53b-util\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.096293 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/918d65ef-006f-4f3a-8736-6af79633c18b-util\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.096303 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrdpc\" (UniqueName: \"kubernetes.io/projected/918d65ef-006f-4f3a-8736-6af79633c18b-kube-api-access-lrdpc\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.384864 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" event={"ID":"918d65ef-006f-4f3a-8736-6af79633c18b","Type":"ContainerDied","Data":"4d0064421a6e6fadaed2a370e577d0b620e2dc73269a875942384a7d471e2050"} Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.384903 4867 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="4d0064421a6e6fadaed2a370e577d0b620e2dc73269a875942384a7d471e2050" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.384927 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.386792 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" event={"ID":"be85d851-fa19-4a82-b64c-80a9a261b53b","Type":"ContainerDied","Data":"8c5175a4caf0084ed5b71dec086b47d9a9ae3a4e0e483b2143380e2b4f8d7b5c"} Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.386809 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c5175a4caf0084ed5b71dec086b47d9a9ae3a4e0e483b2143380e2b4f8d7b5c" Mar 20 00:20:08 crc kubenswrapper[4867]: I0320 00:20:08.386935 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398239 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8"] Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398712 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" containerName="util" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398723 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" containerName="util" Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398730 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="extract" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398736 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="extract" Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398748 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981c2ac7-a232-4a5c-a491-face5b8e5ad2" containerName="oc" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398754 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="981c2ac7-a232-4a5c-a491-face5b8e5ad2" containerName="oc" Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398764 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" containerName="pull" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398770 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" containerName="pull" Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398783 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" 
containerName="extract" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398788 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" containerName="extract" Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398796 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="pull" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398802 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="pull" Mar 20 00:20:10 crc kubenswrapper[4867]: E0320 00:20:10.398812 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="util" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398818 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="util" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398901 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="981c2ac7-a232-4a5c-a491-face5b8e5ad2" containerName="oc" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398914 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="918d65ef-006f-4f3a-8736-6af79633c18b" containerName="extract" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.398921 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="be85d851-fa19-4a82-b64c-80a9a261b53b" containerName="extract" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.399788 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.401922 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.413500 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8"] Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.524999 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plsss\" (UniqueName: \"kubernetes.io/projected/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-kube-api-access-plsss\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.525254 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.525386 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: 
I0320 00:20:10.626518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plsss\" (UniqueName: \"kubernetes.io/projected/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-kube-api-access-plsss\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.626571 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.626608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.627034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.627098 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.646316 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plsss\" (UniqueName: \"kubernetes.io/projected/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-kube-api-access-plsss\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:10 crc kubenswrapper[4867]: I0320 00:20:10.712115 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.135412 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8"] Mar 20 00:20:11 crc kubenswrapper[4867]: W0320 00:20:11.146501 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a1e5e65_8e08_44b2_aac0_2b446cd6d516.slice/crio-369eda2f9a403b32ec144010cd37b78154c451c66dded9170fd39942ba486279 WatchSource:0}: Error finding container 369eda2f9a403b32ec144010cd37b78154c451c66dded9170fd39942ba486279: Status 404 returned error can't find the container with id 369eda2f9a403b32ec144010cd37b78154c451c66dded9170fd39942ba486279 Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.404149 4867 generic.go:334] "Generic (PLEG): container finished" podID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerID="619cd3f3ee613cb7ae2350d7dd4531ad467063a57d13fdb22c4591cfa02b69c7" exitCode=0 
Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.404214 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" event={"ID":"9a1e5e65-8e08-44b2-aac0-2b446cd6d516","Type":"ContainerDied","Data":"619cd3f3ee613cb7ae2350d7dd4531ad467063a57d13fdb22c4591cfa02b69c7"} Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.404252 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" event={"ID":"9a1e5e65-8e08-44b2-aac0-2b446cd6d516","Type":"ContainerStarted","Data":"369eda2f9a403b32ec144010cd37b78154c451c66dded9170fd39942ba486279"} Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.551158 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnj5f"] Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.552465 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.578371 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnj5f"] Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.739310 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9dn\" (UniqueName: \"kubernetes.io/projected/5e17fc1b-5947-4b4c-8796-599a03301f8b-kube-api-access-tv9dn\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.739589 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-utilities\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.739617 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-catalog-content\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.841013 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9dn\" (UniqueName: \"kubernetes.io/projected/5e17fc1b-5947-4b4c-8796-599a03301f8b-kube-api-access-tv9dn\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.841054 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-utilities\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.841077 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-catalog-content\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.841470 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-catalog-content\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.841548 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-utilities\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.869721 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9dn\" (UniqueName: \"kubernetes.io/projected/5e17fc1b-5947-4b4c-8796-599a03301f8b-kube-api-access-tv9dn\") pod \"certified-operators-vnj5f\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") " pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.872387 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.959519 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-kzq86"] Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.960135 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.961613 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.961924 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.962170 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-wvccn" Mar 20 00:20:11 crc kubenswrapper[4867]: I0320 00:20:11.974357 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-kzq86"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.145718 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt8dp\" (UniqueName: \"kubernetes.io/projected/b8b87de0-04fd-464f-8357-df1bed1c47e9-kube-api-access-kt8dp\") pod \"obo-prometheus-operator-8ff7d675-kzq86\" (UID: \"b8b87de0-04fd-464f-8357-df1bed1c47e9\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.246996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt8dp\" (UniqueName: \"kubernetes.io/projected/b8b87de0-04fd-464f-8357-df1bed1c47e9-kube-api-access-kt8dp\") pod \"obo-prometheus-operator-8ff7d675-kzq86\" (UID: 
\"b8b87de0-04fd-464f-8357-df1bed1c47e9\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.277329 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt8dp\" (UniqueName: \"kubernetes.io/projected/b8b87de0-04fd-464f-8357-df1bed1c47e9-kube-api-access-kt8dp\") pod \"obo-prometheus-operator-8ff7d675-kzq86\" (UID: \"b8b87de0-04fd-464f-8357-df1bed1c47e9\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.281959 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.286555 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.288720 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-n8s7l" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.289001 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.289657 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.290289 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.299389 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.312074 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.428479 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnj5f"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.449203 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae3d5dfc-6422-4cea-81fe-b238e1e25562-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm\" (UID: \"ae3d5dfc-6422-4cea-81fe-b238e1e25562\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.449274 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2\" (UID: \"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.449297 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2\" (UID: 
\"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.449325 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae3d5dfc-6422-4cea-81fe-b238e1e25562-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm\" (UID: \"ae3d5dfc-6422-4cea-81fe-b238e1e25562\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.550934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2\" (UID: \"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.551448 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2\" (UID: \"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.551480 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae3d5dfc-6422-4cea-81fe-b238e1e25562-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm\" (UID: \"ae3d5dfc-6422-4cea-81fe-b238e1e25562\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc 
kubenswrapper[4867]: I0320 00:20:12.551550 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae3d5dfc-6422-4cea-81fe-b238e1e25562-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm\" (UID: \"ae3d5dfc-6422-4cea-81fe-b238e1e25562\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.558664 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2\" (UID: \"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.559252 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2\" (UID: \"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.559955 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae3d5dfc-6422-4cea-81fe-b238e1e25562-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm\" (UID: \"ae3d5dfc-6422-4cea-81fe-b238e1e25562\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.563881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ae3d5dfc-6422-4cea-81fe-b238e1e25562-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm\" (UID: \"ae3d5dfc-6422-4cea-81fe-b238e1e25562\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.574901 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.613847 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.625696 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.674283 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-229gh"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.675577 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.678866 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.679056 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-zcf95" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.687137 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-229gh"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.856512 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwcrt\" (UniqueName: \"kubernetes.io/projected/a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae-kube-api-access-bwcrt\") pod \"observability-operator-6dd7dd855f-229gh\" (UID: \"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae\") " pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.857636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-229gh\" (UID: \"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae\") " pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.886512 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-kzq86"] Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.956473 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2"] Mar 20 00:20:12 crc kubenswrapper[4867]: 
I0320 00:20:12.959154 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-229gh\" (UID: \"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae\") " pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.959221 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwcrt\" (UniqueName: \"kubernetes.io/projected/a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae-kube-api-access-bwcrt\") pod \"observability-operator-6dd7dd855f-229gh\" (UID: \"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae\") " pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.982231 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-229gh\" (UID: \"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae\") " pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:12 crc kubenswrapper[4867]: I0320 00:20:12.989257 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwcrt\" (UniqueName: \"kubernetes.io/projected/a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae-kube-api-access-bwcrt\") pod \"observability-operator-6dd7dd855f-229gh\" (UID: \"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae\") " pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.018020 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.029811 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-59494d8978-x6l52"] Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.034025 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.038549 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.038589 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-75kvv" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.049712 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-59494d8978-x6l52"] Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.161204 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zkmb\" (UniqueName: \"kubernetes.io/projected/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-kube-api-access-7zkmb\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.161269 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-webhook-cert\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.161512 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-apiservice-cert\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.161601 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-openshift-service-ca\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.199189 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm"] Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.263274 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-webhook-cert\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.263342 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-apiservice-cert\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.263374 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-openshift-service-ca\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.263415 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zkmb\" (UniqueName: \"kubernetes.io/projected/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-kube-api-access-7zkmb\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.267011 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-webhook-cert\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.267665 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-openshift-service-ca\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.281749 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zkmb\" (UniqueName: \"kubernetes.io/projected/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-kube-api-access-7zkmb\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.283006 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a6984f3-b144-46cf-b343-6d8a75ebb8d7-apiservice-cert\") pod \"perses-operator-59494d8978-x6l52\" (UID: \"2a6984f3-b144-46cf-b343-6d8a75ebb8d7\") " pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.310873 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-229gh"] Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.351845 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.418162 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-229gh" event={"ID":"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae","Type":"ContainerStarted","Data":"0c8ba2a45a3710974c555b0edef2eb7286f1c456a0a4bf8de8fd15123325e7ab"} Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.419218 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" event={"ID":"b8b87de0-04fd-464f-8357-df1bed1c47e9","Type":"ContainerStarted","Data":"22b28be1587b7fcd087b5369b98ea2d5fb04e3c2a49fa779ab7ecb69588a3b4b"} Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.419908 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" event={"ID":"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09","Type":"ContainerStarted","Data":"a30a7e3cd80cb30468219496c73988d67c2f7e2f738409630105d407f3f1b5df"} Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.420985 4867 generic.go:334] "Generic (PLEG): container finished" podID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerID="96bbad4c5993ce160da361c7aea4d16f3daa6a384ea7054b9c3c0c6fb1bb9364" exitCode=0 Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.423463 
4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerDied","Data":"96bbad4c5993ce160da361c7aea4d16f3daa6a384ea7054b9c3c0c6fb1bb9364"} Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.423526 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerStarted","Data":"9f703220c07a83d060b44021667df7b3af45be97d548dc40747d29d823e6876b"} Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.432703 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" event={"ID":"ae3d5dfc-6422-4cea-81fe-b238e1e25562","Type":"ContainerStarted","Data":"c751830cb06cb715514dc1f1288f89a8122575b3a02a34362b841b0f3a97c8b4"} Mar 20 00:20:13 crc kubenswrapper[4867]: I0320 00:20:13.668173 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-59494d8978-x6l52"] Mar 20 00:20:13 crc kubenswrapper[4867]: W0320 00:20:13.684209 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a6984f3_b144_46cf_b343_6d8a75ebb8d7.slice/crio-cf3a45bf5e455c2464e99a251e3b85d358d02c6f9fa036b1d5f666d0a1c5bc5d WatchSource:0}: Error finding container cf3a45bf5e455c2464e99a251e3b85d358d02c6f9fa036b1d5f666d0a1c5bc5d: Status 404 returned error can't find the container with id cf3a45bf5e455c2464e99a251e3b85d358d02c6f9fa036b1d5f666d0a1c5bc5d Mar 20 00:20:14 crc kubenswrapper[4867]: I0320 00:20:14.442385 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerStarted","Data":"d02785415e5c5e9cc73ac81e3be72cab0a02e775e0f910c7845d4a42007265fe"} Mar 20 00:20:14 crc 
kubenswrapper[4867]: I0320 00:20:14.446128 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-59494d8978-x6l52" event={"ID":"2a6984f3-b144-46cf-b343-6d8a75ebb8d7","Type":"ContainerStarted","Data":"cf3a45bf5e455c2464e99a251e3b85d358d02c6f9fa036b1d5f666d0a1c5bc5d"} Mar 20 00:20:15 crc kubenswrapper[4867]: I0320 00:20:15.457477 4867 generic.go:334] "Generic (PLEG): container finished" podID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerID="d02785415e5c5e9cc73ac81e3be72cab0a02e775e0f910c7845d4a42007265fe" exitCode=0 Mar 20 00:20:15 crc kubenswrapper[4867]: I0320 00:20:15.457535 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerDied","Data":"d02785415e5c5e9cc73ac81e3be72cab0a02e775e0f910c7845d4a42007265fe"} Mar 20 00:20:15 crc kubenswrapper[4867]: I0320 00:20:15.794572 4867 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 00:20:15 crc kubenswrapper[4867]: I0320 00:20:15.990592 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rphtl"] Mar 20 00:20:15 crc kubenswrapper[4867]: I0320 00:20:15.992007 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.040938 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rphtl"] Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.133868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z9qr\" (UniqueName: \"kubernetes.io/projected/1855dda1-516d-48b8-ada8-1261bfd820e3-kube-api-access-8z9qr\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.133917 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-catalog-content\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.133947 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-utilities\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.234749 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-catalog-content\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.234817 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-utilities\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.234900 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z9qr\" (UniqueName: \"kubernetes.io/projected/1855dda1-516d-48b8-ada8-1261bfd820e3-kube-api-access-8z9qr\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.235796 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-catalog-content\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.236074 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-utilities\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.272233 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z9qr\" (UniqueName: \"kubernetes.io/projected/1855dda1-516d-48b8-ada8-1261bfd820e3-kube-api-access-8z9qr\") pod \"redhat-operators-rphtl\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") " pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:16 crc kubenswrapper[4867]: I0320 00:20:16.312749 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rphtl" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.179535 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-5dc476cd58-hsfm2"] Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.184558 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.185601 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0059668d-6e6e-48af-a02b-6f46100cf896-webhook-cert\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.185674 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0059668d-6e6e-48af-a02b-6f46100cf896-apiservice-cert\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.185723 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qhr4\" (UniqueName: \"kubernetes.io/projected/0059668d-6e6e-48af-a02b-6f46100cf896-kube-api-access-5qhr4\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.192760 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.193235 4867 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.193250 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-cwldh" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.193268 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.196673 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-5dc476cd58-hsfm2"] Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.286620 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qhr4\" (UniqueName: \"kubernetes.io/projected/0059668d-6e6e-48af-a02b-6f46100cf896-kube-api-access-5qhr4\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.286671 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0059668d-6e6e-48af-a02b-6f46100cf896-webhook-cert\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.286720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0059668d-6e6e-48af-a02b-6f46100cf896-apiservice-cert\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.303462 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0059668d-6e6e-48af-a02b-6f46100cf896-webhook-cert\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.307873 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qhr4\" (UniqueName: \"kubernetes.io/projected/0059668d-6e6e-48af-a02b-6f46100cf896-kube-api-access-5qhr4\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.317771 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0059668d-6e6e-48af-a02b-6f46100cf896-apiservice-cert\") pod \"elastic-operator-5dc476cd58-hsfm2\" (UID: \"0059668d-6e6e-48af-a02b-6f46100cf896\") " pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:19 crc kubenswrapper[4867]: I0320 00:20:19.505911 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.705966 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hv6bm"] Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.706838 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.711436 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-9rgm8" Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.721262 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hv6bm"] Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.769621 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wgjj\" (UniqueName: \"kubernetes.io/projected/4c6c5309-a9c3-456c-b6d6-2af53b15791a-kube-api-access-4wgjj\") pod \"interconnect-operator-5bb49f789d-hv6bm\" (UID: \"4c6c5309-a9c3-456c-b6d6-2af53b15791a\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.870479 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wgjj\" (UniqueName: \"kubernetes.io/projected/4c6c5309-a9c3-456c-b6d6-2af53b15791a-kube-api-access-4wgjj\") pod \"interconnect-operator-5bb49f789d-hv6bm\" (UID: \"4c6c5309-a9c3-456c-b6d6-2af53b15791a\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" Mar 20 00:20:23 crc kubenswrapper[4867]: I0320 00:20:23.905349 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wgjj\" (UniqueName: \"kubernetes.io/projected/4c6c5309-a9c3-456c-b6d6-2af53b15791a-kube-api-access-4wgjj\") pod \"interconnect-operator-5bb49f789d-hv6bm\" (UID: \"4c6c5309-a9c3-456c-b6d6-2af53b15791a\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" Mar 20 00:20:24 crc kubenswrapper[4867]: I0320 00:20:24.095127 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" Mar 20 00:20:28 crc kubenswrapper[4867]: E0320 00:20:28.524706 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93" Mar 20 00:20:28 crc kubenswrapper[4867]: E0320 00:20:28.525677 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2_openshift-operators(1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 00:20:28 crc kubenswrapper[4867]: E0320 00:20:28.528692 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" podUID="1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09" Mar 20 00:20:28 crc kubenswrapper[4867]: I0320 00:20:28.808392 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-5dc476cd58-hsfm2"] Mar 20 00:20:28 crc kubenswrapper[4867]: I0320 
00:20:28.863584 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rphtl"] Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.098316 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hv6bm"] Mar 20 00:20:29 crc kubenswrapper[4867]: W0320 00:20:29.102865 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c6c5309_a9c3_456c_b6d6_2af53b15791a.slice/crio-93dfbd975fd9ee8770a273b8d15a75534f401e2c67bb081d151d5e41188e1287 WatchSource:0}: Error finding container 93dfbd975fd9ee8770a273b8d15a75534f401e2c67bb081d151d5e41188e1287: Status 404 returned error can't find the container with id 93dfbd975fd9ee8770a273b8d15a75534f401e2c67bb081d151d5e41188e1287 Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.577263 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerStarted","Data":"76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.578534 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" event={"ID":"0059668d-6e6e-48af-a02b-6f46100cf896","Type":"ContainerStarted","Data":"951bac0c0773886234f698b426fe33fbbf4f9947efae697f4a225da77700ced0"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.581373 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-229gh" event={"ID":"a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae","Type":"ContainerStarted","Data":"8ad4359f348f7ac1d3c3552f8b2749f80fd79e1275dfb22124a0f323183394f0"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.581762 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.588001 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" event={"ID":"b8b87de0-04fd-464f-8357-df1bed1c47e9","Type":"ContainerStarted","Data":"4748876f43f9f5f6236e0f2f2f62b9bb2f34a32e45c35c770a39d0949decee77"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.590417 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" event={"ID":"1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09","Type":"ContainerStarted","Data":"4ea80db2889b0a83389f796d957502ab500ce8d2ce5ff2c19081843f38592bf2"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.597113 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-229gh" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.597372 4867 generic.go:334] "Generic (PLEG): container finished" podID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerID="4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2" exitCode=0 Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.597430 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerDied","Data":"4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.597448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerStarted","Data":"6d50cd320959900f5924953e09a2b8abe497682fc6f43b5978c18393c7a009cf"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.601006 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-vnj5f" podStartSLOduration=11.740214848 podStartE2EDuration="18.600986139s" podCreationTimestamp="2026-03-20 00:20:11 +0000 UTC" firstStartedPulling="2026-03-20 00:20:13.426532357 +0000 UTC m=+827.653069874" lastFinishedPulling="2026-03-20 00:20:20.287303648 +0000 UTC m=+834.513841165" observedRunningTime="2026-03-20 00:20:29.595942088 +0000 UTC m=+843.822479605" watchObservedRunningTime="2026-03-20 00:20:29.600986139 +0000 UTC m=+843.827523656" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.602240 4867 generic.go:334] "Generic (PLEG): container finished" podID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerID="c41f355d0a6d22fd1684a973624acac8d07190235cd066aa22b3c9f44bbab8da" exitCode=0 Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.602306 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" event={"ID":"9a1e5e65-8e08-44b2-aac0-2b446cd6d516","Type":"ContainerDied","Data":"c41f355d0a6d22fd1684a973624acac8d07190235cd066aa22b3c9f44bbab8da"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.607416 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-59494d8978-x6l52" event={"ID":"2a6984f3-b144-46cf-b343-6d8a75ebb8d7","Type":"ContainerStarted","Data":"6802c2cefd1e0a84635f933291b9b19cd3f8a183f0a4f05374e77e6e9eddaa89"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.607569 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.610274 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" event={"ID":"ae3d5dfc-6422-4cea-81fe-b238e1e25562","Type":"ContainerStarted","Data":"a8cee78d0766b23d2525782da5f9bd0f805583db6d91eb854492d1bde85a3fea"} Mar 20 
00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.617556 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-kzq86" podStartSLOduration=2.9709188490000003 podStartE2EDuration="18.617539649s" podCreationTimestamp="2026-03-20 00:20:11 +0000 UTC" firstStartedPulling="2026-03-20 00:20:12.948006236 +0000 UTC m=+827.174543753" lastFinishedPulling="2026-03-20 00:20:28.594627036 +0000 UTC m=+842.821164553" observedRunningTime="2026-03-20 00:20:29.615922617 +0000 UTC m=+843.842460134" watchObservedRunningTime="2026-03-20 00:20:29.617539649 +0000 UTC m=+843.844077166" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.617729 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" event={"ID":"4c6c5309-a9c3-456c-b6d6-2af53b15791a","Type":"ContainerStarted","Data":"93dfbd975fd9ee8770a273b8d15a75534f401e2c67bb081d151d5e41188e1287"} Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.645523 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-229gh" podStartSLOduration=2.365118818 podStartE2EDuration="17.645505026s" podCreationTimestamp="2026-03-20 00:20:12 +0000 UTC" firstStartedPulling="2026-03-20 00:20:13.342602975 +0000 UTC m=+827.569140492" lastFinishedPulling="2026-03-20 00:20:28.622989173 +0000 UTC m=+842.849526700" observedRunningTime="2026-03-20 00:20:29.645381823 +0000 UTC m=+843.871919350" watchObservedRunningTime="2026-03-20 00:20:29.645505026 +0000 UTC m=+843.872042533" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.675917 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2" podStartSLOduration=-9223372019.178877 podStartE2EDuration="17.675898926s" podCreationTimestamp="2026-03-20 00:20:12 +0000 UTC" firstStartedPulling="2026-03-20 
00:20:12.970741017 +0000 UTC m=+827.197278534" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:20:29.672443686 +0000 UTC m=+843.898981213" watchObservedRunningTime="2026-03-20 00:20:29.675898926 +0000 UTC m=+843.902436443" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.713207 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-59494d8978-x6l52" podStartSLOduration=1.8977038529999999 podStartE2EDuration="16.713190456s" podCreationTimestamp="2026-03-20 00:20:13 +0000 UTC" firstStartedPulling="2026-03-20 00:20:13.686361751 +0000 UTC m=+827.912899268" lastFinishedPulling="2026-03-20 00:20:28.501848344 +0000 UTC m=+842.728385871" observedRunningTime="2026-03-20 00:20:29.712915709 +0000 UTC m=+843.939453226" watchObservedRunningTime="2026-03-20 00:20:29.713190456 +0000 UTC m=+843.939727973" Mar 20 00:20:29 crc kubenswrapper[4867]: I0320 00:20:29.757669 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm" podStartSLOduration=2.4010301419999998 podStartE2EDuration="17.757651882s" podCreationTimestamp="2026-03-20 00:20:12 +0000 UTC" firstStartedPulling="2026-03-20 00:20:13.234668409 +0000 UTC m=+827.461205926" lastFinishedPulling="2026-03-20 00:20:28.591290149 +0000 UTC m=+842.817827666" observedRunningTime="2026-03-20 00:20:29.756722827 +0000 UTC m=+843.983260344" watchObservedRunningTime="2026-03-20 00:20:29.757651882 +0000 UTC m=+843.984189399" Mar 20 00:20:30 crc kubenswrapper[4867]: I0320 00:20:30.627330 4867 generic.go:334] "Generic (PLEG): container finished" podID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerID="c8803ceb2a52568c5e421f8d90c66322071dcad646de59be9c1b9740d6acdb3a" exitCode=0 Mar 20 00:20:30 crc kubenswrapper[4867]: I0320 00:20:30.627504 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" event={"ID":"9a1e5e65-8e08-44b2-aac0-2b446cd6d516","Type":"ContainerDied","Data":"c8803ceb2a52568c5e421f8d90c66322071dcad646de59be9c1b9740d6acdb3a"} Mar 20 00:20:31 crc kubenswrapper[4867]: I0320 00:20:31.873029 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:31 crc kubenswrapper[4867]: I0320 00:20:31.873270 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:31 crc kubenswrapper[4867]: I0320 00:20:31.928677 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vnj5f" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.410466 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.520094 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plsss\" (UniqueName: \"kubernetes.io/projected/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-kube-api-access-plsss\") pod \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.520177 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-bundle\") pod \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.520249 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-util\") pod 
\"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\" (UID: \"9a1e5e65-8e08-44b2-aac0-2b446cd6d516\") " Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.521104 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-bundle" (OuterVolumeSpecName: "bundle") pod "9a1e5e65-8e08-44b2-aac0-2b446cd6d516" (UID: "9a1e5e65-8e08-44b2-aac0-2b446cd6d516"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.529747 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-util" (OuterVolumeSpecName: "util") pod "9a1e5e65-8e08-44b2-aac0-2b446cd6d516" (UID: "9a1e5e65-8e08-44b2-aac0-2b446cd6d516"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.532173 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-kube-api-access-plsss" (OuterVolumeSpecName: "kube-api-access-plsss") pod "9a1e5e65-8e08-44b2-aac0-2b446cd6d516" (UID: "9a1e5e65-8e08-44b2-aac0-2b446cd6d516"). InnerVolumeSpecName "kube-api-access-plsss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.621681 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.621710 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-util\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.621719 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plsss\" (UniqueName: \"kubernetes.io/projected/9a1e5e65-8e08-44b2-aac0-2b446cd6d516-kube-api-access-plsss\") on node \"crc\" DevicePath \"\"" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.641114 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerStarted","Data":"df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61"} Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.643461 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" event={"ID":"9a1e5e65-8e08-44b2-aac0-2b446cd6d516","Type":"ContainerDied","Data":"369eda2f9a403b32ec144010cd37b78154c451c66dded9170fd39942ba486279"} Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.643499 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="369eda2f9a403b32ec144010cd37b78154c451c66dded9170fd39942ba486279" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.643517 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8" Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.645052 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" event={"ID":"0059668d-6e6e-48af-a02b-6f46100cf896","Type":"ContainerStarted","Data":"004ff86deaee2068749121f8562e9254a6921770fb90202be7bd13c30eb4e08e"} Mar 20 00:20:32 crc kubenswrapper[4867]: I0320 00:20:32.692804 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-5dc476cd58-hsfm2" podStartSLOduration=10.13757539 podStartE2EDuration="13.692790058s" podCreationTimestamp="2026-03-20 00:20:19 +0000 UTC" firstStartedPulling="2026-03-20 00:20:28.821131204 +0000 UTC m=+843.047668721" lastFinishedPulling="2026-03-20 00:20:32.376345872 +0000 UTC m=+846.602883389" observedRunningTime="2026-03-20 00:20:32.689032121 +0000 UTC m=+846.915569648" watchObservedRunningTime="2026-03-20 00:20:32.692790058 +0000 UTC m=+846.919327575" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.356661 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-59494d8978-x6l52" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.656816 4867 generic.go:334] "Generic (PLEG): container finished" podID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerID="df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61" exitCode=0 Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.656952 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerDied","Data":"df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61"} Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.878812 4867 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/elasticsearch-es-default-0"] Mar 20 00:20:33 crc kubenswrapper[4867]: E0320 00:20:33.879062 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="pull" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.879083 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="pull" Mar 20 00:20:33 crc kubenswrapper[4867]: E0320 00:20:33.879099 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="extract" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.879107 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="extract" Mar 20 00:20:33 crc kubenswrapper[4867]: E0320 00:20:33.879119 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="util" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.879127 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="util" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.879273 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a1e5e65-8e08-44b2-aac0-2b446cd6d516" containerName="extract" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.880198 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.884262 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.884570 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.885120 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.885199 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.885514 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-gm7h6" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.885658 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.885751 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.892756 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.892919 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.908881 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937262 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937307 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937335 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937353 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937374 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" 
(UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937476 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937592 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937628 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937657 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937683 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937746 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937855 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937947 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.937986 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: 
\"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:33 crc kubenswrapper[4867]: I0320 00:20:33.938039 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039234 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039300 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 
20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039335 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039354 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039376 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039392 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039412 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: 
\"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039469 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039504 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: 
\"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039542 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.039567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.040202 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.041141 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:20:34 crc 
kubenswrapper[4867]: I0320 00:20:34.047344 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.047397 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.048354 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.050872 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.051052 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.052233 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.052359 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.052437 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.052695 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.054946 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.060208 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.061030 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.061361 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/f4b5c2ce-c1f8-4340-acd1-699ed169fcfb-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:34 crc kubenswrapper[4867]: I0320 00:20:34.193800 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:40 crc kubenswrapper[4867]: I0320 00:20:40.204881 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 20 00:20:40 crc kubenswrapper[4867]: I0320 00:20:40.703288 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb","Type":"ContainerStarted","Data":"4e9f0929f019996de242792dcc77879d73055fcfe9d26bb249da83cff4a2ffc4"}
Mar 20 00:20:41 crc kubenswrapper[4867]: I0320 00:20:41.722514 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" event={"ID":"4c6c5309-a9c3-456c-b6d6-2af53b15791a","Type":"ContainerStarted","Data":"dd584fd546394f69fd667a5747dbe7f23e8cebf84a49edbfdc08549446071b8c"}
Mar 20 00:20:41 crc kubenswrapper[4867]: I0320 00:20:41.725309 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerStarted","Data":"7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854"}
Mar 20 00:20:41 crc kubenswrapper[4867]: I0320 00:20:41.747557 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-hv6bm" podStartSLOduration=8.039502019 podStartE2EDuration="18.747513893s" podCreationTimestamp="2026-03-20 00:20:23 +0000 UTC" firstStartedPulling="2026-03-20 00:20:29.105361184 +0000 UTC m=+843.331898701" lastFinishedPulling="2026-03-20 00:20:39.813373058 +0000 UTC m=+854.039910575" observedRunningTime="2026-03-20 00:20:41.744784792 +0000 UTC m=+855.971322309" watchObservedRunningTime="2026-03-20 00:20:41.747513893 +0000 UTC m=+855.974051410"
Mar 20 00:20:41 crc kubenswrapper[4867]: I0320 00:20:41.782049 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rphtl" podStartSLOduration=16.617110796 podStartE2EDuration="26.782034161s" podCreationTimestamp="2026-03-20 00:20:15 +0000 UTC" firstStartedPulling="2026-03-20 00:20:29.601176464 +0000 UTC m=+843.827713971" lastFinishedPulling="2026-03-20 00:20:39.766099809 +0000 UTC m=+853.992637336" observedRunningTime="2026-03-20 00:20:41.775416059 +0000 UTC m=+856.001953576" watchObservedRunningTime="2026-03-20 00:20:41.782034161 +0000 UTC m=+856.008571678"
Mar 20 00:20:42 crc kubenswrapper[4867]: I0320 00:20:42.037124 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vnj5f"
Mar 20 00:20:45 crc kubenswrapper[4867]: I0320 00:20:45.945616 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnj5f"]
Mar 20 00:20:45 crc kubenswrapper[4867]: I0320 00:20:45.946422 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vnj5f" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="registry-server" containerID="cri-o://76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48" gracePeriod=2
Mar 20 00:20:46 crc kubenswrapper[4867]: I0320 00:20:46.313012 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rphtl"
Mar 20 00:20:46 crc kubenswrapper[4867]: I0320 00:20:46.313265 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rphtl"
Mar 20 00:20:46 crc kubenswrapper[4867]: I0320 00:20:46.781622 4867 generic.go:334] "Generic (PLEG): container finished" podID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerID="76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48" exitCode=0
Mar 20 00:20:46 crc kubenswrapper[4867]: I0320 00:20:46.781675 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerDied","Data":"76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48"}
Mar 20 00:20:47 crc kubenswrapper[4867]: I0320 00:20:47.144743 4867 scope.go:117] "RemoveContainer" containerID="d2c383227d0bb7ea2a5692407530a2a7e7be5abf41ca9501791a8dd92e9681e4"
Mar 20 00:20:47 crc kubenswrapper[4867]: I0320 00:20:47.416201 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rphtl" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="registry-server" probeResult="failure" output=<
Mar 20 00:20:47 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s
Mar 20 00:20:47 crc kubenswrapper[4867]: >
Mar 20 00:20:48 crc kubenswrapper[4867]: I0320 00:20:48.860655 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:20:48 crc kubenswrapper[4867]: I0320 00:20:48.861025 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:20:51 crc kubenswrapper[4867]: E0320 00:20:51.873277 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48 is running failed: container process not found" containerID="76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 00:20:51 crc kubenswrapper[4867]: E0320 00:20:51.874088 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48 is running failed: container process not found" containerID="76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 00:20:51 crc kubenswrapper[4867]: E0320 00:20:51.874716 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48 is running failed: container process not found" containerID="76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 00:20:51 crc kubenswrapper[4867]: E0320 00:20:51.874778 4867 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-vnj5f" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="registry-server"
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.256558 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"]
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.257550 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:52 crc kubenswrapper[4867]: W0320 00:20:52.259995 4867 reflector.go:561] object-"cert-manager-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager-operator": no relationship found between node 'crc' and this object
Mar 20 00:20:52 crc kubenswrapper[4867]: W0320 00:20:52.260024 4867 reflector.go:561] object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-p4gbp": failed to list *v1.Secret: secrets "cert-manager-operator-controller-manager-dockercfg-p4gbp" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "cert-manager-operator": no relationship found between node 'crc' and this object
Mar 20 00:20:52 crc kubenswrapper[4867]: E0320 00:20:52.260036 4867 reflector.go:158] "Unhandled Error" err="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 20 00:20:52 crc kubenswrapper[4867]: E0320 00:20:52.260057 4867 reflector.go:158] "Unhandled Error" err="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-p4gbp\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-manager-operator-controller-manager-dockercfg-p4gbp\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"cert-manager-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 20 00:20:52 crc kubenswrapper[4867]: W0320 00:20:52.260101 4867 reflector.go:561] object-"cert-manager-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager-operator": no relationship found between node 'crc' and this object
Mar 20 00:20:52 crc kubenswrapper[4867]: E0320 00:20:52.260113 4867 reflector.go:158] "Unhandled Error" err="object-\"cert-manager-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.279789 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"]
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.357450 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllp8\" (UniqueName: \"kubernetes.io/projected/7f1c1f73-c046-4599-941a-4941a352d746-kube-api-access-fllp8\") pod \"cert-manager-operator-controller-manager-5586865c96-fc42v\" (UID: \"7f1c1f73-c046-4599-941a-4941a352d746\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.357584 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f1c1f73-c046-4599-941a-4941a352d746-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-fc42v\" (UID: \"7f1c1f73-c046-4599-941a-4941a352d746\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.458704 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllp8\" (UniqueName: \"kubernetes.io/projected/7f1c1f73-c046-4599-941a-4941a352d746-kube-api-access-fllp8\") pod \"cert-manager-operator-controller-manager-5586865c96-fc42v\" (UID: \"7f1c1f73-c046-4599-941a-4941a352d746\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.458828 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f1c1f73-c046-4599-941a-4941a352d746-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-fc42v\" (UID: \"7f1c1f73-c046-4599-941a-4941a352d746\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:52 crc kubenswrapper[4867]: I0320 00:20:52.460442 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f1c1f73-c046-4599-941a-4941a352d746-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-fc42v\" (UID: \"7f1c1f73-c046-4599-941a-4941a352d746\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:53 crc kubenswrapper[4867]: I0320 00:20:53.347945 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 20 00:20:53 crc kubenswrapper[4867]: I0320 00:20:53.568839 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 20 00:20:53 crc kubenswrapper[4867]: I0320 00:20:53.579061 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fllp8\" (UniqueName: \"kubernetes.io/projected/7f1c1f73-c046-4599-941a-4941a352d746-kube-api-access-fllp8\") pod \"cert-manager-operator-controller-manager-5586865c96-fc42v\" (UID: \"7f1c1f73-c046-4599-941a-4941a352d746\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:53 crc kubenswrapper[4867]: I0320 00:20:53.737391 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-p4gbp"
Mar 20 00:20:53 crc kubenswrapper[4867]: I0320 00:20:53.779231 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.062282 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnj5f"
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.179875 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv9dn\" (UniqueName: \"kubernetes.io/projected/5e17fc1b-5947-4b4c-8796-599a03301f8b-kube-api-access-tv9dn\") pod \"5e17fc1b-5947-4b4c-8796-599a03301f8b\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") "
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.179935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-utilities\") pod \"5e17fc1b-5947-4b4c-8796-599a03301f8b\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") "
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.180004 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-catalog-content\") pod \"5e17fc1b-5947-4b4c-8796-599a03301f8b\" (UID: \"5e17fc1b-5947-4b4c-8796-599a03301f8b\") "
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.180851 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-utilities" (OuterVolumeSpecName: "utilities") pod "5e17fc1b-5947-4b4c-8796-599a03301f8b" (UID: "5e17fc1b-5947-4b4c-8796-599a03301f8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.184242 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e17fc1b-5947-4b4c-8796-599a03301f8b-kube-api-access-tv9dn" (OuterVolumeSpecName: "kube-api-access-tv9dn") pod "5e17fc1b-5947-4b4c-8796-599a03301f8b" (UID: "5e17fc1b-5947-4b4c-8796-599a03301f8b"). InnerVolumeSpecName "kube-api-access-tv9dn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.231940 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e17fc1b-5947-4b4c-8796-599a03301f8b" (UID: "5e17fc1b-5947-4b4c-8796-599a03301f8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.281343 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.281663 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv9dn\" (UniqueName: \"kubernetes.io/projected/5e17fc1b-5947-4b4c-8796-599a03301f8b-kube-api-access-tv9dn\") on node \"crc\" DevicePath \"\""
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.281680 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e17fc1b-5947-4b4c-8796-599a03301f8b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.574557 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v"]
Mar 20 00:20:54 crc kubenswrapper[4867]: W0320 00:20:54.582638 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f1c1f73_c046_4599_941a_4941a352d746.slice/crio-73f352ee829f2f1d465406f8729f69306d0e8cbc5d3b2029de304184504a18c0 WatchSource:0}: Error finding container 73f352ee829f2f1d465406f8729f69306d0e8cbc5d3b2029de304184504a18c0: Status 404 returned error can't find the container with id 73f352ee829f2f1d465406f8729f69306d0e8cbc5d3b2029de304184504a18c0
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.832915 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v" event={"ID":"7f1c1f73-c046-4599-941a-4941a352d746","Type":"ContainerStarted","Data":"73f352ee829f2f1d465406f8729f69306d0e8cbc5d3b2029de304184504a18c0"}
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.835066 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnj5f" event={"ID":"5e17fc1b-5947-4b4c-8796-599a03301f8b","Type":"ContainerDied","Data":"9f703220c07a83d060b44021667df7b3af45be97d548dc40747d29d823e6876b"}
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.835091 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnj5f"
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.835101 4867 scope.go:117] "RemoveContainer" containerID="76ee174d81cee9285ebc39a5406195a91be4c9ad8e4e679b32843c07a4653e48"
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.852376 4867 scope.go:117] "RemoveContainer" containerID="d02785415e5c5e9cc73ac81e3be72cab0a02e775e0f910c7845d4a42007265fe"
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.857601 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnj5f"]
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.861655 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vnj5f"]
Mar 20 00:20:54 crc kubenswrapper[4867]: I0320 00:20:54.874830 4867 scope.go:117] "RemoveContainer" containerID="96bbad4c5993ce160da361c7aea4d16f3daa6a384ea7054b9c3c0c6fb1bb9364"
Mar 20 00:20:55 crc kubenswrapper[4867]: I0320 00:20:55.842407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb","Type":"ContainerStarted","Data":"008020a54aea858cd25922a937a222ac39f34e458dc10978951b0248c1c6c868"}
Mar 20 00:20:56 crc kubenswrapper[4867]: I0320 00:20:56.112392 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 20 00:20:56 crc kubenswrapper[4867]: I0320 00:20:56.148430 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 20 00:20:56 crc kubenswrapper[4867]: I0320 00:20:56.353674 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rphtl"
Mar 20 00:20:56 crc kubenswrapper[4867]: I0320 00:20:56.393659 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rphtl"
Mar 20 00:20:56 crc kubenswrapper[4867]: I0320 00:20:56.430667 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" path="/var/lib/kubelet/pods/5e17fc1b-5947-4b4c-8796-599a03301f8b/volumes"
Mar 20 00:20:57 crc kubenswrapper[4867]: I0320 00:20:57.352221 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rphtl"]
Mar 20 00:20:57 crc kubenswrapper[4867]: I0320 00:20:57.854571 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v" event={"ID":"7f1c1f73-c046-4599-941a-4941a352d746","Type":"ContainerStarted","Data":"b51c2ce350361b0b7c2570fd278ee9d863abe44b029432e9279ee6b014a332e4"}
Mar 20 00:20:57 crc kubenswrapper[4867]: I0320 00:20:57.856449 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b5c2ce-c1f8-4340-acd1-699ed169fcfb" containerID="008020a54aea858cd25922a937a222ac39f34e458dc10978951b0248c1c6c868" exitCode=0
Mar 20 00:20:57 crc kubenswrapper[4867]: I0320 00:20:57.856527 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb","Type":"ContainerDied","Data":"008020a54aea858cd25922a937a222ac39f34e458dc10978951b0248c1c6c868"}
Mar 20 00:20:57 crc kubenswrapper[4867]: I0320 00:20:57.856730 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rphtl" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="registry-server" containerID="cri-o://7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854" gracePeriod=2
Mar 20 00:20:57 crc kubenswrapper[4867]: I0320 00:20:57.884038 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-fc42v" podStartSLOduration=3.634912576 podStartE2EDuration="5.884014507s" podCreationTimestamp="2026-03-20 00:20:52 +0000 UTC" firstStartedPulling="2026-03-20 00:20:54.586478689 +0000 UTC m=+868.813016206" lastFinishedPulling="2026-03-20 00:20:56.83558062 +0000 UTC m=+871.062118137" observedRunningTime="2026-03-20 00:20:57.87987163 +0000 UTC m=+872.106409157" watchObservedRunningTime="2026-03-20 00:20:57.884014507 +0000 UTC m=+872.110552064"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.247293 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rphtl"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.336991 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-utilities\") pod \"1855dda1-516d-48b8-ada8-1261bfd820e3\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") "
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.337133 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-catalog-content\") pod \"1855dda1-516d-48b8-ada8-1261bfd820e3\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") "
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.337207 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z9qr\" (UniqueName: \"kubernetes.io/projected/1855dda1-516d-48b8-ada8-1261bfd820e3-kube-api-access-8z9qr\") pod \"1855dda1-516d-48b8-ada8-1261bfd820e3\" (UID: \"1855dda1-516d-48b8-ada8-1261bfd820e3\") "
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.339056 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-utilities" (OuterVolumeSpecName: "utilities") pod "1855dda1-516d-48b8-ada8-1261bfd820e3" (UID: "1855dda1-516d-48b8-ada8-1261bfd820e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.345690 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1855dda1-516d-48b8-ada8-1261bfd820e3-kube-api-access-8z9qr" (OuterVolumeSpecName: "kube-api-access-8z9qr") pod "1855dda1-516d-48b8-ada8-1261bfd820e3" (UID: "1855dda1-516d-48b8-ada8-1261bfd820e3"). InnerVolumeSpecName "kube-api-access-8z9qr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.441938 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z9qr\" (UniqueName: \"kubernetes.io/projected/1855dda1-516d-48b8-ada8-1261bfd820e3-kube-api-access-8z9qr\") on node \"crc\" DevicePath \"\""
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.441977 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.481782 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1855dda1-516d-48b8-ada8-1261bfd820e3" (UID: "1855dda1-516d-48b8-ada8-1261bfd820e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.543329 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1855dda1-516d-48b8-ada8-1261bfd820e3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.865222 4867 generic.go:334] "Generic (PLEG): container finished" podID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerID="7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854" exitCode=0
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.865332 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerDied","Data":"7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854"}
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.865366 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rphtl" event={"ID":"1855dda1-516d-48b8-ada8-1261bfd820e3","Type":"ContainerDied","Data":"6d50cd320959900f5924953e09a2b8abe497682fc6f43b5978c18393c7a009cf"}
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.865407 4867 scope.go:117] "RemoveContainer" containerID="7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.865618 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rphtl"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.869308 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b5c2ce-c1f8-4340-acd1-699ed169fcfb" containerID="2bdb067e15ff822e17819e31ec2e7dd5288e2f53889d06fcc81ebe7aff2b751c" exitCode=0
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.869407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb","Type":"ContainerDied","Data":"2bdb067e15ff822e17819e31ec2e7dd5288e2f53889d06fcc81ebe7aff2b751c"}
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.902910 4867 scope.go:117] "RemoveContainer" containerID="df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.953648 4867 scope.go:117] "RemoveContainer" containerID="4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.955289 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rphtl"]
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.965313 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rphtl"]
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.983715 4867 scope.go:117] "RemoveContainer" containerID="7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854"
Mar 20 00:20:58 crc kubenswrapper[4867]: E0320 00:20:58.984119 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854\": container with ID starting with 7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854 not found: ID does not exist" containerID="7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.984161 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854"} err="failed to get container status \"7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854\": rpc error: code = NotFound desc = could not find container \"7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854\": container with ID starting with 7591fee1c3f6c8e0208825ce19396db271e8cbce4b0a312056c7f4af2030d854 not found: ID does not exist"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.984186 4867 scope.go:117] "RemoveContainer" containerID="df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61"
Mar 20 00:20:58 crc kubenswrapper[4867]: E0320 00:20:58.984464 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61\": container with ID starting with df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61 not found: ID does not exist" containerID="df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.984507 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61"} err="failed to get container status \"df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61\": rpc error: code = NotFound desc = could not find container \"df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61\": container with ID starting with df095c9f496e89496cf354db46ab13a9388f7633dfe6682b4d5e3f13ea4b0f61 not found: ID does not exist"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.984527 4867 scope.go:117] "RemoveContainer" containerID="4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2"
Mar 20 00:20:58 crc kubenswrapper[4867]: E0320 00:20:58.984914 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2\": container with ID starting with 4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2 not found: ID does not exist" containerID="4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2"
Mar 20 00:20:58 crc kubenswrapper[4867]: I0320 00:20:58.984965 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2"} err="failed to get container status \"4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2\": rpc error: code = NotFound desc = could not find container \"4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2\": container with ID starting with 4d6f6082eeb264c604e987c5d500690baa5a2852191e5789af694405999872d2 not found: ID does not exist"
Mar 20 00:20:59 crc kubenswrapper[4867]: I0320 00:20:59.876554 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"f4b5c2ce-c1f8-4340-acd1-699ed169fcfb","Type":"ContainerStarted","Data":"0e658b80c5b2f109a798de4902d2ab5f23c063c50783dc1712b0a6f0106f5ac1"}
Mar 20 00:20:59 crc kubenswrapper[4867]: I0320 00:20:59.877422 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0"
Mar 20 00:20:59 crc kubenswrapper[4867]: I0320 00:20:59.912935 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=12.444675551 podStartE2EDuration="26.912918725s" podCreationTimestamp="2026-03-20 00:20:33 +0000 UTC" firstStartedPulling="2026-03-20 00:20:40.228112081 +0000 UTC m=+854.454649598" lastFinishedPulling="2026-03-20 00:20:54.696355255 +0000 UTC m=+868.922892772" observedRunningTime="2026-03-20 00:20:59.909150387 +0000 UTC m=+874.135687914" watchObservedRunningTime="2026-03-20 00:20:59.912918725 +0000 UTC m=+874.139456242"
Mar 20 00:21:00 crc kubenswrapper[4867]: I0320 00:21:00.428434 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" path="/var/lib/kubelet/pods/1855dda1-516d-48b8-ada8-1261bfd820e3/volumes"
Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394383 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-p6hww"]
Mar 20 00:21:01 crc kubenswrapper[4867]: E0320 00:21:01.394865 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="registry-server"
Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394879 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="registry-server"
Mar 20 00:21:01 crc kubenswrapper[4867]: E0320 00:21:01.394896 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="registry-server"
Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394903 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="registry-server"
Mar 20 00:21:01 crc kubenswrapper[4867]: E0320 00:21:01.394917 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="extract-utilities"
Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394926 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="extract-utilities"
Mar 20 00:21:01 crc kubenswrapper[4867]: E0320 00:21:01.394935 4867 cpu_manager.go:410]
"RemoveStaleState: removing container" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="extract-content" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394942 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="extract-content" Mar 20 00:21:01 crc kubenswrapper[4867]: E0320 00:21:01.394949 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="extract-utilities" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394956 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="extract-utilities" Mar 20 00:21:01 crc kubenswrapper[4867]: E0320 00:21:01.394963 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="extract-content" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.394970 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="extract-content" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.395070 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1855dda1-516d-48b8-ada8-1261bfd820e3" containerName="registry-server" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.395087 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e17fc1b-5947-4b4c-8796-599a03301f8b" containerName="registry-server" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.395445 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.405330 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.405803 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ctw6b" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.417457 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.426166 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-p6hww"] Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.480071 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwp7g\" (UniqueName: \"kubernetes.io/projected/6af9c67f-2a80-4c2d-8411-ebb8606657ca-kube-api-access-zwp7g\") pod \"cert-manager-webhook-6888856db4-p6hww\" (UID: \"6af9c67f-2a80-4c2d-8411-ebb8606657ca\") " pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.480198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6af9c67f-2a80-4c2d-8411-ebb8606657ca-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-p6hww\" (UID: \"6af9c67f-2a80-4c2d-8411-ebb8606657ca\") " pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.581558 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6af9c67f-2a80-4c2d-8411-ebb8606657ca-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-p6hww\" (UID: 
\"6af9c67f-2a80-4c2d-8411-ebb8606657ca\") " pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.581654 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwp7g\" (UniqueName: \"kubernetes.io/projected/6af9c67f-2a80-4c2d-8411-ebb8606657ca-kube-api-access-zwp7g\") pod \"cert-manager-webhook-6888856db4-p6hww\" (UID: \"6af9c67f-2a80-4c2d-8411-ebb8606657ca\") " pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.606654 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6af9c67f-2a80-4c2d-8411-ebb8606657ca-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-p6hww\" (UID: \"6af9c67f-2a80-4c2d-8411-ebb8606657ca\") " pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.616438 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwp7g\" (UniqueName: \"kubernetes.io/projected/6af9c67f-2a80-4c2d-8411-ebb8606657ca-kube-api-access-zwp7g\") pod \"cert-manager-webhook-6888856db4-p6hww\" (UID: \"6af9c67f-2a80-4c2d-8411-ebb8606657ca\") " pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.708242 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.966202 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-p6hww"] Mar 20 00:21:01 crc kubenswrapper[4867]: W0320 00:21:01.973504 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6af9c67f_2a80_4c2d_8411_ebb8606657ca.slice/crio-b6b5735521f6c5ada36f7d849a0a63cc3823a975811187467346031def7a6eba WatchSource:0}: Error finding container b6b5735521f6c5ada36f7d849a0a63cc3823a975811187467346031def7a6eba: Status 404 returned error can't find the container with id b6b5735521f6c5ada36f7d849a0a63cc3823a975811187467346031def7a6eba Mar 20 00:21:01 crc kubenswrapper[4867]: I0320 00:21:01.975991 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 00:21:02 crc kubenswrapper[4867]: I0320 00:21:02.900889 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" event={"ID":"6af9c67f-2a80-4c2d-8411-ebb8606657ca","Type":"ContainerStarted","Data":"b6b5735521f6c5ada36f7d849a0a63cc3823a975811187467346031def7a6eba"} Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.943199 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-d72zn"] Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.945918 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.949895 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dgj49" Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.965157 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" event={"ID":"6af9c67f-2a80-4c2d-8411-ebb8606657ca","Type":"ContainerStarted","Data":"5757bd9e0039e03cf57fae84b481fde1b849498633579a866bbea4209cb62c4f"} Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.966064 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.983430 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-d72zn"] Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.984141 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edaa424d-60d3-4146-be50-de91d495771c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-d72zn\" (UID: \"edaa424d-60d3-4146-be50-de91d495771c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:06 crc kubenswrapper[4867]: I0320 00:21:06.984196 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69m4\" (UniqueName: \"kubernetes.io/projected/edaa424d-60d3-4146-be50-de91d495771c-kube-api-access-j69m4\") pod \"cert-manager-cainjector-5545bd876-d72zn\" (UID: \"edaa424d-60d3-4146-be50-de91d495771c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.000813 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" podStartSLOduration=1.1855786990000001 podStartE2EDuration="6.000792295s" podCreationTimestamp="2026-03-20 00:21:01 +0000 UTC" firstStartedPulling="2026-03-20 00:21:01.975644631 +0000 UTC m=+876.202182148" lastFinishedPulling="2026-03-20 00:21:06.790858227 +0000 UTC m=+881.017395744" observedRunningTime="2026-03-20 00:21:06.997172861 +0000 UTC m=+881.223710408" watchObservedRunningTime="2026-03-20 00:21:07.000792295 +0000 UTC m=+881.227329812" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.085060 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edaa424d-60d3-4146-be50-de91d495771c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-d72zn\" (UID: \"edaa424d-60d3-4146-be50-de91d495771c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.085138 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69m4\" (UniqueName: \"kubernetes.io/projected/edaa424d-60d3-4146-be50-de91d495771c-kube-api-access-j69m4\") pod \"cert-manager-cainjector-5545bd876-d72zn\" (UID: \"edaa424d-60d3-4146-be50-de91d495771c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.113593 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69m4\" (UniqueName: \"kubernetes.io/projected/edaa424d-60d3-4146-be50-de91d495771c-kube-api-access-j69m4\") pod \"cert-manager-cainjector-5545bd876-d72zn\" (UID: \"edaa424d-60d3-4146-be50-de91d495771c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.120705 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/edaa424d-60d3-4146-be50-de91d495771c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-d72zn\" (UID: \"edaa424d-60d3-4146-be50-de91d495771c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.262411 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.755558 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-d72zn"] Mar 20 00:21:07 crc kubenswrapper[4867]: W0320 00:21:07.761998 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedaa424d_60d3_4146_be50_de91d495771c.slice/crio-575a470691ea1fdffae735323f70cd2ec60471383393d8f2d337908f0b7b0d57 WatchSource:0}: Error finding container 575a470691ea1fdffae735323f70cd2ec60471383393d8f2d337908f0b7b0d57: Status 404 returned error can't find the container with id 575a470691ea1fdffae735323f70cd2ec60471383393d8f2d337908f0b7b0d57 Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.973452 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" event={"ID":"edaa424d-60d3-4146-be50-de91d495771c","Type":"ContainerStarted","Data":"291dce747929c0e9c53a90fc0de06728df5e16274f0a443e32a907b4bcaf9348"} Mar 20 00:21:07 crc kubenswrapper[4867]: I0320 00:21:07.973485 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" event={"ID":"edaa424d-60d3-4146-be50-de91d495771c","Type":"ContainerStarted","Data":"575a470691ea1fdffae735323f70cd2ec60471383393d8f2d337908f0b7b0d57"} Mar 20 00:21:09 crc kubenswrapper[4867]: I0320 00:21:09.300527 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" 
podUID="f4b5c2ce-c1f8-4340-acd1-699ed169fcfb" containerName="elasticsearch" probeResult="failure" output=< Mar 20 00:21:09 crc kubenswrapper[4867]: {"timestamp": "2026-03-20T00:21:09+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 20 00:21:09 crc kubenswrapper[4867]: > Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.167918 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-d72zn" podStartSLOduration=5.16789674 podStartE2EDuration="5.16789674s" podCreationTimestamp="2026-03-20 00:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:21:07.991130301 +0000 UTC m=+882.217667818" watchObservedRunningTime="2026-03-20 00:21:11.16789674 +0000 UTC m=+885.394434267" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.169789 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.170687 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.184415 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.184968 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.185131 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.185440 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.196847 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237114 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqpph\" (UniqueName: \"kubernetes.io/projected/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-kube-api-access-nqpph\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237167 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237191 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237216 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237237 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237299 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237322 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-proxy-ca-bundles\") pod 
\"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237376 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237401 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237425 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.237449 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.338862 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.338913 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.338935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.338973 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.338995 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339011 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339032 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqpph\" (UniqueName: \"kubernetes.io/projected/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-kube-api-access-nqpph\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildcachedir\") pod 
\"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339080 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339448 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339543 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339573 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" 
(UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339653 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339688 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339802 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339894 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.339937 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.340362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.340492 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.355292 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.355344 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 
00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.366405 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqpph\" (UniqueName: \"kubernetes.io/projected/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-kube-api-access-nqpph\") pod \"service-telemetry-operator-1-build\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.485450 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.711359 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-p6hww" Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.885191 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 20 00:21:11 crc kubenswrapper[4867]: W0320 00:21:11.891611 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f3a93a7_6cdf_4009_9501_0c9be2f9cecf.slice/crio-f8db18d6b88f7f4b65a31870acc10eb8b0254a22d57e70093cf0823566521b41 WatchSource:0}: Error finding container f8db18d6b88f7f4b65a31870acc10eb8b0254a22d57e70093cf0823566521b41: Status 404 returned error can't find the container with id f8db18d6b88f7f4b65a31870acc10eb8b0254a22d57e70093cf0823566521b41 Mar 20 00:21:11 crc kubenswrapper[4867]: I0320 00:21:11.994687 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf","Type":"ContainerStarted","Data":"f8db18d6b88f7f4b65a31870acc10eb8b0254a22d57e70093cf0823566521b41"} Mar 20 00:21:14 crc kubenswrapper[4867]: I0320 00:21:14.497570 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="service-telemetry/elasticsearch-es-default-0" Mar 20 00:21:18 crc kubenswrapper[4867]: I0320 00:21:18.860880 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:21:18 crc kubenswrapper[4867]: I0320 00:21:18.862148 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.374760 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-hkcm2"] Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.375547 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.378952 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-q5q79" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.384268 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-hkcm2"] Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.479607 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/673670e9-a11e-4623-b235-b8c8aecbf191-bound-sa-token\") pod \"cert-manager-545d4d4674-hkcm2\" (UID: \"673670e9-a11e-4623-b235-b8c8aecbf191\") " pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.479691 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lld69\" (UniqueName: \"kubernetes.io/projected/673670e9-a11e-4623-b235-b8c8aecbf191-kube-api-access-lld69\") pod \"cert-manager-545d4d4674-hkcm2\" (UID: \"673670e9-a11e-4623-b235-b8c8aecbf191\") " pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.581043 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lld69\" (UniqueName: \"kubernetes.io/projected/673670e9-a11e-4623-b235-b8c8aecbf191-kube-api-access-lld69\") pod \"cert-manager-545d4d4674-hkcm2\" (UID: \"673670e9-a11e-4623-b235-b8c8aecbf191\") " pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.581138 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/673670e9-a11e-4623-b235-b8c8aecbf191-bound-sa-token\") pod \"cert-manager-545d4d4674-hkcm2\" (UID: 
\"673670e9-a11e-4623-b235-b8c8aecbf191\") " pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.604216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/673670e9-a11e-4623-b235-b8c8aecbf191-bound-sa-token\") pod \"cert-manager-545d4d4674-hkcm2\" (UID: \"673670e9-a11e-4623-b235-b8c8aecbf191\") " pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.604411 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lld69\" (UniqueName: \"kubernetes.io/projected/673670e9-a11e-4623-b235-b8c8aecbf191-kube-api-access-lld69\") pod \"cert-manager-545d4d4674-hkcm2\" (UID: \"673670e9-a11e-4623-b235-b8c8aecbf191\") " pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:20 crc kubenswrapper[4867]: I0320 00:21:20.701987 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-hkcm2" Mar 20 00:21:21 crc kubenswrapper[4867]: I0320 00:21:21.176451 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.809702 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.810813 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.813222 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.813567 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.813744 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.837885 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910551 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mhdp\" (UniqueName: \"kubernetes.io/projected/66a97697-c1e3-446a-8d03-aaa79491e1c2-kube-api-access-9mhdp\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910601 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910653 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildworkdir\") pod 
\"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910690 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910713 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910742 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910780 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910803 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910828 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910886 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:22 crc kubenswrapper[4867]: I0320 00:21:22.910911 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-pull\") pod 
\"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011621 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mhdp\" (UniqueName: \"kubernetes.io/projected/66a97697-c1e3-446a-8d03-aaa79491e1c2-kube-api-access-9mhdp\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011713 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011772 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011800 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011835 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011856 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011914 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.011959 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.012431 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.012815 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" 
Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.013762 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.013821 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.013839 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.014167 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.014382 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.014398 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.018475 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-push\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.029827 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.031173 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.031746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mhdp\" (UniqueName: \"kubernetes.io/projected/66a97697-c1e3-446a-8d03-aaa79491e1c2-kube-api-access-9mhdp\") pod 
\"service-telemetry-operator-2-build\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.201356 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:21:23 crc kubenswrapper[4867]: I0320 00:21:23.754413 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-hkcm2"] Mar 20 00:21:23 crc kubenswrapper[4867]: W0320 00:21:23.760259 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod673670e9_a11e_4623_b235_b8c8aecbf191.slice/crio-c4ce60a6263f547084a70f3514fbff7a6c072282d0a27d5cf86f1f9fdbf6c23c WatchSource:0}: Error finding container c4ce60a6263f547084a70f3514fbff7a6c072282d0a27d5cf86f1f9fdbf6c23c: Status 404 returned error can't find the container with id c4ce60a6263f547084a70f3514fbff7a6c072282d0a27d5cf86f1f9fdbf6c23c Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.033304 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 20 00:21:24 crc kubenswrapper[4867]: W0320 00:21:24.043135 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66a97697_c1e3_446a_8d03_aaa79491e1c2.slice/crio-1dd52512dda79f390482cc3f2d899f6d6c6b82efee5544a3bcf1f0720fb3802a WatchSource:0}: Error finding container 1dd52512dda79f390482cc3f2d899f6d6c6b82efee5544a3bcf1f0720fb3802a: Status 404 returned error can't find the container with id 1dd52512dda79f390482cc3f2d899f6d6c6b82efee5544a3bcf1f0720fb3802a Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.070409 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-hkcm2" 
event={"ID":"673670e9-a11e-4623-b235-b8c8aecbf191","Type":"ContainerStarted","Data":"8613b530ce1390ad26239ba64506b4c493370a61a63006a66ba634eb7ecba5d3"} Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.070722 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-hkcm2" event={"ID":"673670e9-a11e-4623-b235-b8c8aecbf191","Type":"ContainerStarted","Data":"c4ce60a6263f547084a70f3514fbff7a6c072282d0a27d5cf86f1f9fdbf6c23c"} Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.072839 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf","Type":"ContainerStarted","Data":"bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd"} Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.072974 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" containerName="manage-dockerfile" containerID="cri-o://bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd" gracePeriod=30 Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.078650 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerStarted","Data":"1dd52512dda79f390482cc3f2d899f6d6c6b82efee5544a3bcf1f0720fb3802a"} Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.098177 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-hkcm2" podStartSLOduration=4.098162221 podStartE2EDuration="4.098162221s" podCreationTimestamp="2026-03-20 00:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:21:24.097474673 +0000 UTC m=+898.324012190" 
watchObservedRunningTime="2026-03-20 00:21:24.098162221 +0000 UTC m=+898.324699738" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.430842 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_6f3a93a7-6cdf-4009-9501-0c9be2f9cecf/manage-dockerfile/0.log" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.431242 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.538785 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-node-pullsecrets\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.538872 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-blob-cache\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.538880 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.538908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-pull\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.538951 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-system-configs\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.538978 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildworkdir\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539024 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildcachedir\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539091 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-root\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539123 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-proxy-ca-bundles\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539160 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqpph\" (UniqueName: \"kubernetes.io/projected/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-kube-api-access-nqpph\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539191 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-run\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539263 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-ca-bundles\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539297 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-push\") pod \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\" (UID: \"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf\") " Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539377 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod 
"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539614 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539632 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539867 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539891 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.539949 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.540097 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.540209 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.540224 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.540334 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.544193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.544341 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-kube-api-access-nqpph" (OuterVolumeSpecName: "kube-api-access-nqpph") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "kube-api-access-nqpph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.544416 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" (UID: "6f3a93a7-6cdf-4009-9501-0c9be2f9cecf"). InnerVolumeSpecName "builder-dockercfg-526rz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640359 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640398 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640413 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640425 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640437 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640448 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640459 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-root\") on node \"crc\" DevicePath 
\"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640470 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640482 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqpph\" (UniqueName: \"kubernetes.io/projected/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-kube-api-access-nqpph\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:24 crc kubenswrapper[4867]: I0320 00:21:24.640515 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.088130 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_6f3a93a7-6cdf-4009-9501-0c9be2f9cecf/manage-dockerfile/0.log" Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.088209 4867 generic.go:334] "Generic (PLEG): container finished" podID="6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" containerID="bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd" exitCode=1 Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.088292 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf","Type":"ContainerDied","Data":"bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd"} Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.088297 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.088331 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6f3a93a7-6cdf-4009-9501-0c9be2f9cecf","Type":"ContainerDied","Data":"f8db18d6b88f7f4b65a31870acc10eb8b0254a22d57e70093cf0823566521b41"} Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.088357 4867 scope.go:117] "RemoveContainer" containerID="bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd" Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.091290 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerStarted","Data":"6000b9324e701ef1cd45facb4bb674068cbf170050ef267790621331656cec0b"} Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.108817 4867 scope.go:117] "RemoveContainer" containerID="bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd" Mar 20 00:21:25 crc kubenswrapper[4867]: E0320 00:21:25.111764 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd\": container with ID starting with bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd not found: ID does not exist" containerID="bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd" Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.111815 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd"} err="failed to get container status \"bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd\": rpc error: code = NotFound desc = could not find container 
\"bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd\": container with ID starting with bb65cbf76c21ebc61cc5e644e50d0c427d57548a2d1691d0dcfbccc24023b4fd not found: ID does not exist" Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.157706 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 20 00:21:25 crc kubenswrapper[4867]: I0320 00:21:25.167231 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 20 00:21:26 crc kubenswrapper[4867]: I0320 00:21:26.432207 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" path="/var/lib/kubelet/pods/6f3a93a7-6cdf-4009-9501-0c9be2f9cecf/volumes" Mar 20 00:21:36 crc kubenswrapper[4867]: I0320 00:21:36.172800 4867 generic.go:334] "Generic (PLEG): container finished" podID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerID="6000b9324e701ef1cd45facb4bb674068cbf170050ef267790621331656cec0b" exitCode=0 Mar 20 00:21:36 crc kubenswrapper[4867]: I0320 00:21:36.172909 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerDied","Data":"6000b9324e701ef1cd45facb4bb674068cbf170050ef267790621331656cec0b"} Mar 20 00:21:37 crc kubenswrapper[4867]: I0320 00:21:37.183107 4867 generic.go:334] "Generic (PLEG): container finished" podID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerID="b84de07d349738ebf95b24e4c2e1c52a87059b38ae6621434bb01c23a5ceaff0" exitCode=0 Mar 20 00:21:37 crc kubenswrapper[4867]: I0320 00:21:37.183153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerDied","Data":"b84de07d349738ebf95b24e4c2e1c52a87059b38ae6621434bb01c23a5ceaff0"} Mar 20 00:21:37 crc 
kubenswrapper[4867]: I0320 00:21:37.250478 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_66a97697-c1e3-446a-8d03-aaa79491e1c2/manage-dockerfile/0.log" Mar 20 00:21:38 crc kubenswrapper[4867]: I0320 00:21:38.209116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerStarted","Data":"b5b58ba7055425a4c130372522eb7fc12884791e7ffdcef5babfa12238ac6c17"} Mar 20 00:21:48 crc kubenswrapper[4867]: I0320 00:21:48.860088 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:21:48 crc kubenswrapper[4867]: I0320 00:21:48.860610 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:21:48 crc kubenswrapper[4867]: I0320 00:21:48.860660 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:21:48 crc kubenswrapper[4867]: I0320 00:21:48.861277 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33d423cd234b4249f2b7bca2c14b518d37de21e54afaaa7c2a2edca5bfac00fe"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 00:21:48 crc kubenswrapper[4867]: I0320 
00:21:48.861343 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://33d423cd234b4249f2b7bca2c14b518d37de21e54afaaa7c2a2edca5bfac00fe" gracePeriod=600 Mar 20 00:21:50 crc kubenswrapper[4867]: I0320 00:21:50.304065 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="33d423cd234b4249f2b7bca2c14b518d37de21e54afaaa7c2a2edca5bfac00fe" exitCode=0 Mar 20 00:21:50 crc kubenswrapper[4867]: I0320 00:21:50.304149 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"33d423cd234b4249f2b7bca2c14b518d37de21e54afaaa7c2a2edca5bfac00fe"} Mar 20 00:21:50 crc kubenswrapper[4867]: I0320 00:21:50.304721 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"34d1897a4a5c08cebf4ce21f533f5206883f8b21f0e746b2aa5a509e335c6921"} Mar 20 00:21:50 crc kubenswrapper[4867]: I0320 00:21:50.304742 4867 scope.go:117] "RemoveContainer" containerID="4f460505b5fefef8711ce12da7e390bf20ce1928a1083fdde393f4aa4fca83e6" Mar 20 00:21:50 crc kubenswrapper[4867]: I0320 00:21:50.323363 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=28.323343361 podStartE2EDuration="28.323343361s" podCreationTimestamp="2026-03-20 00:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:21:38.241774756 +0000 UTC m=+912.468312263" watchObservedRunningTime="2026-03-20 
00:21:50.323343361 +0000 UTC m=+924.549880878" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.146862 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566102-s6lwn"] Mar 20 00:22:00 crc kubenswrapper[4867]: E0320 00:22:00.147570 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" containerName="manage-dockerfile" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.147585 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" containerName="manage-dockerfile" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.147704 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3a93a7-6cdf-4009-9501-0c9be2f9cecf" containerName="manage-dockerfile" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.148163 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.150948 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.152221 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.152454 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.154164 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566102-s6lwn"] Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.337975 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54k9n\" (UniqueName: 
\"kubernetes.io/projected/6f399014-a31b-4635-8ebf-44180df3444c-kube-api-access-54k9n\") pod \"auto-csr-approver-29566102-s6lwn\" (UID: \"6f399014-a31b-4635-8ebf-44180df3444c\") " pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.439851 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54k9n\" (UniqueName: \"kubernetes.io/projected/6f399014-a31b-4635-8ebf-44180df3444c-kube-api-access-54k9n\") pod \"auto-csr-approver-29566102-s6lwn\" (UID: \"6f399014-a31b-4635-8ebf-44180df3444c\") " pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.471878 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54k9n\" (UniqueName: \"kubernetes.io/projected/6f399014-a31b-4635-8ebf-44180df3444c-kube-api-access-54k9n\") pod \"auto-csr-approver-29566102-s6lwn\" (UID: \"6f399014-a31b-4635-8ebf-44180df3444c\") " pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.763395 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:00 crc kubenswrapper[4867]: I0320 00:22:00.940857 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566102-s6lwn"] Mar 20 00:22:00 crc kubenswrapper[4867]: W0320 00:22:00.946675 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f399014_a31b_4635_8ebf_44180df3444c.slice/crio-9a1af54c551f3aedd78d53941acde075398aeb3b2275d42a75798339d52595d4 WatchSource:0}: Error finding container 9a1af54c551f3aedd78d53941acde075398aeb3b2275d42a75798339d52595d4: Status 404 returned error can't find the container with id 9a1af54c551f3aedd78d53941acde075398aeb3b2275d42a75798339d52595d4 Mar 20 00:22:01 crc kubenswrapper[4867]: I0320 00:22:01.369688 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" event={"ID":"6f399014-a31b-4635-8ebf-44180df3444c","Type":"ContainerStarted","Data":"9a1af54c551f3aedd78d53941acde075398aeb3b2275d42a75798339d52595d4"} Mar 20 00:22:02 crc kubenswrapper[4867]: I0320 00:22:02.379783 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" event={"ID":"6f399014-a31b-4635-8ebf-44180df3444c","Type":"ContainerStarted","Data":"17a900a2a06674978edc805b394e10ec18d6fb866835f845221484916811b0c3"} Mar 20 00:22:03 crc kubenswrapper[4867]: I0320 00:22:03.402455 4867 generic.go:334] "Generic (PLEG): container finished" podID="6f399014-a31b-4635-8ebf-44180df3444c" containerID="17a900a2a06674978edc805b394e10ec18d6fb866835f845221484916811b0c3" exitCode=0 Mar 20 00:22:03 crc kubenswrapper[4867]: I0320 00:22:03.402512 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" 
event={"ID":"6f399014-a31b-4635-8ebf-44180df3444c","Type":"ContainerDied","Data":"17a900a2a06674978edc805b394e10ec18d6fb866835f845221484916811b0c3"} Mar 20 00:22:04 crc kubenswrapper[4867]: I0320 00:22:04.685519 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:04 crc kubenswrapper[4867]: I0320 00:22:04.812077 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54k9n\" (UniqueName: \"kubernetes.io/projected/6f399014-a31b-4635-8ebf-44180df3444c-kube-api-access-54k9n\") pod \"6f399014-a31b-4635-8ebf-44180df3444c\" (UID: \"6f399014-a31b-4635-8ebf-44180df3444c\") " Mar 20 00:22:04 crc kubenswrapper[4867]: I0320 00:22:04.820072 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f399014-a31b-4635-8ebf-44180df3444c-kube-api-access-54k9n" (OuterVolumeSpecName: "kube-api-access-54k9n") pod "6f399014-a31b-4635-8ebf-44180df3444c" (UID: "6f399014-a31b-4635-8ebf-44180df3444c"). InnerVolumeSpecName "kube-api-access-54k9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:22:04 crc kubenswrapper[4867]: I0320 00:22:04.914015 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54k9n\" (UniqueName: \"kubernetes.io/projected/6f399014-a31b-4635-8ebf-44180df3444c-kube-api-access-54k9n\") on node \"crc\" DevicePath \"\"" Mar 20 00:22:05 crc kubenswrapper[4867]: I0320 00:22:05.418018 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" event={"ID":"6f399014-a31b-4635-8ebf-44180df3444c","Type":"ContainerDied","Data":"9a1af54c551f3aedd78d53941acde075398aeb3b2275d42a75798339d52595d4"} Mar 20 00:22:05 crc kubenswrapper[4867]: I0320 00:22:05.418318 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1af54c551f3aedd78d53941acde075398aeb3b2275d42a75798339d52595d4" Mar 20 00:22:05 crc kubenswrapper[4867]: I0320 00:22:05.418125 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566102-s6lwn" Mar 20 00:22:05 crc kubenswrapper[4867]: I0320 00:22:05.484209 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566096-l8bkt"] Mar 20 00:22:05 crc kubenswrapper[4867]: I0320 00:22:05.491801 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566096-l8bkt"] Mar 20 00:22:06 crc kubenswrapper[4867]: I0320 00:22:06.440931 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2" path="/var/lib/kubelet/pods/e7a8fc2b-967b-4b1f-9ceb-1f7ecbfe9ec2/volumes" Mar 20 00:22:54 crc kubenswrapper[4867]: I0320 00:22:54.376803 4867 scope.go:117] "RemoveContainer" containerID="87550ab71a28fecc4f6573b5137c2dbe64d336c6dae82aa97c920357e2634015" Mar 20 00:23:02 crc kubenswrapper[4867]: I0320 00:23:02.826344 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerID="b5b58ba7055425a4c130372522eb7fc12884791e7ffdcef5babfa12238ac6c17" exitCode=0 Mar 20 00:23:02 crc kubenswrapper[4867]: I0320 00:23:02.826467 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerDied","Data":"b5b58ba7055425a4c130372522eb7fc12884791e7ffdcef5babfa12238ac6c17"} Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.077751 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241217 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-pull\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241299 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-blob-cache\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241340 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildworkdir\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241388 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-ca-bundles\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241485 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-system-configs\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241552 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mhdp\" (UniqueName: \"kubernetes.io/projected/66a97697-c1e3-446a-8d03-aaa79491e1c2-kube-api-access-9mhdp\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241588 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-run\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241634 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-node-pullsecrets\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241664 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildcachedir\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") " Mar 20 
00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241711 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-proxy-ca-bundles\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") "
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241915 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-root\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") "
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.241948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-push\") pod \"66a97697-c1e3-446a-8d03-aaa79491e1c2\" (UID: \"66a97697-c1e3-446a-8d03-aaa79491e1c2\") "
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.242728 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.242739 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.242853 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.242940 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.243480 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.244676 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.251027 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "builder-dockercfg-526rz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.252813 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a97697-c1e3-446a-8d03-aaa79491e1c2-kube-api-access-9mhdp" (OuterVolumeSpecName: "kube-api-access-9mhdp") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "kube-api-access-9mhdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.259717 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.304152 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343228 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343298 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343327 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343353 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/66a97697-c1e3-446a-8d03-aaa79491e1c2-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343377 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343395 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343411 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343428 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mhdp\" (UniqueName: \"kubernetes.io/projected/66a97697-c1e3-446a-8d03-aaa79491e1c2-kube-api-access-9mhdp\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343445 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.343461 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66a97697-c1e3-446a-8d03-aaa79491e1c2-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.428870 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.446265 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.846299 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"66a97697-c1e3-446a-8d03-aaa79491e1c2","Type":"ContainerDied","Data":"1dd52512dda79f390482cc3f2d899f6d6c6b82efee5544a3bcf1f0720fb3802a"}
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.846353 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd52512dda79f390482cc3f2d899f6d6c6b82efee5544a3bcf1f0720fb3802a"
Mar 20 00:23:04 crc kubenswrapper[4867]: I0320 00:23:04.846433 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Mar 20 00:23:06 crc kubenswrapper[4867]: I0320 00:23:06.628773 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "66a97697-c1e3-446a-8d03-aaa79491e1c2" (UID: "66a97697-c1e3-446a-8d03-aaa79491e1c2"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:23:06 crc kubenswrapper[4867]: I0320 00:23:06.677747 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/66a97697-c1e3-446a-8d03-aaa79491e1c2-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.889655 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 20 00:23:08 crc kubenswrapper[4867]: E0320 00:23:08.890182 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="docker-build"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890194 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="docker-build"
Mar 20 00:23:08 crc kubenswrapper[4867]: E0320 00:23:08.890204 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="manage-dockerfile"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890209 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="manage-dockerfile"
Mar 20 00:23:08 crc kubenswrapper[4867]: E0320 00:23:08.890222 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f399014-a31b-4635-8ebf-44180df3444c" containerName="oc"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890228 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f399014-a31b-4635-8ebf-44180df3444c" containerName="oc"
Mar 20 00:23:08 crc kubenswrapper[4867]: E0320 00:23:08.890239 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="git-clone"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890245 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="git-clone"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890343 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a97697-c1e3-446a-8d03-aaa79491e1c2" containerName="docker-build"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890352 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f399014-a31b-4635-8ebf-44180df3444c" containerName="oc"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.890938 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.893597 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.894076 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.894223 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.894451 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config"
Mar 20 00:23:08 crc kubenswrapper[4867]: I0320 00:23:08.905972 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.047792 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.047872 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048009 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048105 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048145 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048187 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048269 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048329 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048397 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048550 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.048596 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwjjz\" (UniqueName: \"kubernetes.io/projected/e7f67f62-3429-4d7f-aaa9-36914083d06e-kube-api-access-rwjjz\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.149521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.149823 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.149716 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.149917 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150028 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150091 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150181 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150243 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwjjz\" (UniqueName: \"kubernetes.io/projected/e7f67f62-3429-4d7f-aaa9-36914083d06e-kube-api-access-rwjjz\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150255 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150301 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150365 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150656 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.150726 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.151003 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.151084 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.151369 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.151640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.151862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.151984 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.152145 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.157216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.166684 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.170557 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwjjz\" (UniqueName: \"kubernetes.io/projected/e7f67f62-3429-4d7f-aaa9-36914083d06e-kube-api-access-rwjjz\") pod \"smart-gateway-operator-1-build\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.218978 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.651332 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 20 00:23:09 crc kubenswrapper[4867]: W0320 00:23:09.659791 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7f67f62_3429_4d7f_aaa9_36914083d06e.slice/crio-95d5c7a17a8ae304c8ce57bb1b89ab75c8add0e2c02aa007e5a492ca4f54d5b5 WatchSource:0}: Error finding container 95d5c7a17a8ae304c8ce57bb1b89ab75c8add0e2c02aa007e5a492ca4f54d5b5: Status 404 returned error can't find the container with id 95d5c7a17a8ae304c8ce57bb1b89ab75c8add0e2c02aa007e5a492ca4f54d5b5
Mar 20 00:23:09 crc kubenswrapper[4867]: I0320 00:23:09.880946 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e7f67f62-3429-4d7f-aaa9-36914083d06e","Type":"ContainerStarted","Data":"95d5c7a17a8ae304c8ce57bb1b89ab75c8add0e2c02aa007e5a492ca4f54d5b5"}
Mar 20 00:23:10 crc kubenswrapper[4867]: I0320 00:23:10.910038 4867 generic.go:334] "Generic (PLEG): container finished" podID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerID="099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c" exitCode=0
Mar 20 00:23:10 crc kubenswrapper[4867]: I0320 00:23:10.910091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e7f67f62-3429-4d7f-aaa9-36914083d06e","Type":"ContainerDied","Data":"099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c"}
Mar 20 00:23:11 crc kubenswrapper[4867]: I0320 00:23:11.917585 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e7f67f62-3429-4d7f-aaa9-36914083d06e","Type":"ContainerStarted","Data":"5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a"}
Mar 20 00:23:11 crc kubenswrapper[4867]: I0320 00:23:11.958376 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.9583542019999998 podStartE2EDuration="3.958354202s" podCreationTimestamp="2026-03-20 00:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:23:11.947355866 +0000 UTC m=+1006.173893383" watchObservedRunningTime="2026-03-20 00:23:11.958354202 +0000 UTC m=+1006.184891739"
Mar 20 00:23:19 crc kubenswrapper[4867]: I0320 00:23:19.516114 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 20 00:23:19 crc kubenswrapper[4867]: I0320 00:23:19.516930 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerName="docker-build" containerID="cri-o://5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a" gracePeriod=30
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.363658 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e7f67f62-3429-4d7f-aaa9-36914083d06e/docker-build/0.log"
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.364436 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507753 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildcachedir\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507891 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildworkdir\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507922 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-proxy-ca-bundles\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507914 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-pull\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507965 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-run\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.507989 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-push\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508014 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-blob-cache\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508036 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-ca-bundles\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508051 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-root\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508068 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwjjz\" (UniqueName: \"kubernetes.io/projected/e7f67f62-3429-4d7f-aaa9-36914083d06e-kube-api-access-rwjjz\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508093 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-node-pullsecrets\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508112 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-system-configs\") pod \"e7f67f62-3429-4d7f-aaa9-36914083d06e\" (UID: \"e7f67f62-3429-4d7f-aaa9-36914083d06e\") "
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508249 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508713 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508718 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.508870 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.509301 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.509393 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.509667 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.513231 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f67f62-3429-4d7f-aaa9-36914083d06e-kube-api-access-rwjjz" (OuterVolumeSpecName: "kube-api-access-rwjjz") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "kube-api-access-rwjjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.513409 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.515972 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "builder-dockercfg-526rz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609666 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609705 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609749 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609768 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609782 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/e7f67f62-3429-4d7f-aaa9-36914083d06e-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609797 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609811 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwjjz\" (UniqueName: \"kubernetes.io/projected/e7f67f62-3429-4d7f-aaa9-36914083d06e-kube-api-access-rwjjz\") on node 
\"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609828 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e7f67f62-3429-4d7f-aaa9-36914083d06e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.609845 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.678736 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.710436 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.919900 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e7f67f62-3429-4d7f-aaa9-36914083d06e" (UID: "e7f67f62-3429-4d7f-aaa9-36914083d06e"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.979153 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e7f67f62-3429-4d7f-aaa9-36914083d06e/docker-build/0.log" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.979708 4867 generic.go:334] "Generic (PLEG): container finished" podID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerID="5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a" exitCode=1 Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.979748 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e7f67f62-3429-4d7f-aaa9-36914083d06e","Type":"ContainerDied","Data":"5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a"} Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.979776 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e7f67f62-3429-4d7f-aaa9-36914083d06e","Type":"ContainerDied","Data":"95d5c7a17a8ae304c8ce57bb1b89ab75c8add0e2c02aa007e5a492ca4f54d5b5"} Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.979793 4867 scope.go:117] "RemoveContainer" containerID="5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a" Mar 20 00:23:20 crc kubenswrapper[4867]: I0320 00:23:20.979877 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.014390 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e7f67f62-3429-4d7f-aaa9-36914083d06e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.015711 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.030601 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.058504 4867 scope.go:117] "RemoveContainer" containerID="099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.092822 4867 scope.go:117] "RemoveContainer" containerID="5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a" Mar 20 00:23:21 crc kubenswrapper[4867]: E0320 00:23:21.093209 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a\": container with ID starting with 5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a not found: ID does not exist" containerID="5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.093250 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a"} err="failed to get container status \"5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a\": rpc error: code = NotFound desc = could not find container 
\"5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a\": container with ID starting with 5be12cd585a57919f4a26c92c091a688042f64fecd9d5d89cf5b6841675b803a not found: ID does not exist" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.093276 4867 scope.go:117] "RemoveContainer" containerID="099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c" Mar 20 00:23:21 crc kubenswrapper[4867]: E0320 00:23:21.093644 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c\": container with ID starting with 099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c not found: ID does not exist" containerID="099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.093670 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c"} err="failed to get container status \"099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c\": rpc error: code = NotFound desc = could not find container \"099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c\": container with ID starting with 099ad6e8b2a24cc3224750e7470719793bf2246c6fe8cb43fe9a776c8619526c not found: ID does not exist" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.123250 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 20 00:23:21 crc kubenswrapper[4867]: E0320 00:23:21.123528 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerName="manage-dockerfile" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.123542 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" 
containerName="manage-dockerfile" Mar 20 00:23:21 crc kubenswrapper[4867]: E0320 00:23:21.123561 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerName="docker-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.123627 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerName="docker-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.123760 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" containerName="docker-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.124605 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.126517 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.126815 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.127171 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.127390 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.146995 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.319676 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.319761 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.319819 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.319859 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.319945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.319980 
4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.320022 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.320059 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.320092 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jws5\" (UniqueName: \"kubernetes.io/projected/70187bed-f441-4814-9002-8df2d36a5afd-kube-api-access-2jws5\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.320128 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-pull\") pod 
\"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.320166 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.320266 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421243 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421355 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" 
(UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421484 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jws5\" (UniqueName: \"kubernetes.io/projected/70187bed-f441-4814-9002-8df2d36a5afd-kube-api-access-2jws5\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421655 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421667 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421706 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421764 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421817 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421870 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc 
kubenswrapper[4867]: I0320 00:23:21.421935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.421997 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.422212 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.422286 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.422745 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:23:21 
crc kubenswrapper[4867]: I0320 00:23:21.423060 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.423212 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.423458 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.425322 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-push\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.433631 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.436193 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.437868 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jws5\" (UniqueName: \"kubernetes.io/projected/70187bed-f441-4814-9002-8df2d36a5afd-kube-api-access-2jws5\") pod \"smart-gateway-operator-2-build\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.444334 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.686141 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Mar 20 00:23:21 crc kubenswrapper[4867]: I0320 00:23:21.986145 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerStarted","Data":"1cb8bcff373f0f31387c199e6c34b1cb4566da24f3c07132e427c80e5d0c5a07"}
Mar 20 00:23:22 crc kubenswrapper[4867]: I0320 00:23:22.432672 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f67f62-3429-4d7f-aaa9-36914083d06e" path="/var/lib/kubelet/pods/e7f67f62-3429-4d7f-aaa9-36914083d06e/volumes"
Mar 20 00:23:22 crc kubenswrapper[4867]: I0320 00:23:22.994717 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerStarted","Data":"9e41d4b8c0c456f80d806fe68b8e39ac3cb7a5f4cd678e6d00bb5823c8c04647"}
Mar 20 00:23:24 crc kubenswrapper[4867]: I0320 00:23:24.005898 4867 generic.go:334] "Generic (PLEG): container finished" podID="70187bed-f441-4814-9002-8df2d36a5afd" containerID="9e41d4b8c0c456f80d806fe68b8e39ac3cb7a5f4cd678e6d00bb5823c8c04647" exitCode=0
Mar 20 00:23:24 crc kubenswrapper[4867]: I0320 00:23:24.005938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerDied","Data":"9e41d4b8c0c456f80d806fe68b8e39ac3cb7a5f4cd678e6d00bb5823c8c04647"}
Mar 20 00:23:25 crc kubenswrapper[4867]: I0320 00:23:25.017277 4867 generic.go:334] "Generic (PLEG): container finished" podID="70187bed-f441-4814-9002-8df2d36a5afd" containerID="9f4e026ea592c3e1496f509ba2df595346c5c67e42a32f846521d51c07087e7d" exitCode=0
Mar 20 00:23:25 crc kubenswrapper[4867]: I0320 00:23:25.017462 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerDied","Data":"9f4e026ea592c3e1496f509ba2df595346c5c67e42a32f846521d51c07087e7d"}
Mar 20 00:23:25 crc kubenswrapper[4867]: I0320 00:23:25.054832 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_70187bed-f441-4814-9002-8df2d36a5afd/manage-dockerfile/0.log"
Mar 20 00:23:26 crc kubenswrapper[4867]: I0320 00:23:26.026869 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerStarted","Data":"1dec1291223a0ca2a64d544d482a77ea81702de9cf619e8708b53296463d4677"}
Mar 20 00:23:26 crc kubenswrapper[4867]: I0320 00:23:26.060243 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.060226511 podStartE2EDuration="5.060226511s" podCreationTimestamp="2026-03-20 00:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:23:26.054107902 +0000 UTC m=+1020.280645479" watchObservedRunningTime="2026-03-20 00:23:26.060226511 +0000 UTC m=+1020.286764028"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.605664 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bf5f7"]
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.607868 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.614171 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bf5f7"]
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.641638 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-catalog-content\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.641911 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswkh\" (UniqueName: \"kubernetes.io/projected/c114d030-d054-44a6-b10b-1ad432c2f9d7-kube-api-access-gswkh\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.642015 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-utilities\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.743672 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-catalog-content\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.743727 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gswkh\" (UniqueName: \"kubernetes.io/projected/c114d030-d054-44a6-b10b-1ad432c2f9d7-kube-api-access-gswkh\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.743763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-utilities\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.744207 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-catalog-content\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.744249 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-utilities\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.763020 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswkh\" (UniqueName: \"kubernetes.io/projected/c114d030-d054-44a6-b10b-1ad432c2f9d7-kube-api-access-gswkh\") pod \"community-operators-bf5f7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") " pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:37 crc kubenswrapper[4867]: I0320 00:23:37.930372 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:38 crc kubenswrapper[4867]: I0320 00:23:38.206774 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bf5f7"]
Mar 20 00:23:39 crc kubenswrapper[4867]: I0320 00:23:39.120381 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bf5f7" event={"ID":"c114d030-d054-44a6-b10b-1ad432c2f9d7","Type":"ContainerStarted","Data":"ab049a8c54842ce7380dd35c2cb0a0d57645c6731b16e75c20ac06dc211a4ee7"}
Mar 20 00:23:43 crc kubenswrapper[4867]: I0320 00:23:43.189315 4867 generic.go:334] "Generic (PLEG): container finished" podID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerID="3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723" exitCode=0
Mar 20 00:23:43 crc kubenswrapper[4867]: I0320 00:23:43.189368 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bf5f7" event={"ID":"c114d030-d054-44a6-b10b-1ad432c2f9d7","Type":"ContainerDied","Data":"3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723"}
Mar 20 00:23:46 crc kubenswrapper[4867]: I0320 00:23:46.211267 4867 generic.go:334] "Generic (PLEG): container finished" podID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerID="c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3" exitCode=0
Mar 20 00:23:46 crc kubenswrapper[4867]: I0320 00:23:46.211347 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bf5f7" event={"ID":"c114d030-d054-44a6-b10b-1ad432c2f9d7","Type":"ContainerDied","Data":"c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3"}
Mar 20 00:23:49 crc kubenswrapper[4867]: I0320 00:23:49.233375 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bf5f7" event={"ID":"c114d030-d054-44a6-b10b-1ad432c2f9d7","Type":"ContainerStarted","Data":"9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e"}
Mar 20 00:23:50 crc kubenswrapper[4867]: I0320 00:23:50.260538 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bf5f7" podStartSLOduration=7.477782869 podStartE2EDuration="13.260516267s" podCreationTimestamp="2026-03-20 00:23:37 +0000 UTC" firstStartedPulling="2026-03-20 00:23:43.19156672 +0000 UTC m=+1037.418104297" lastFinishedPulling="2026-03-20 00:23:48.974300128 +0000 UTC m=+1043.200837695" observedRunningTime="2026-03-20 00:23:50.256627556 +0000 UTC m=+1044.483165093" watchObservedRunningTime="2026-03-20 00:23:50.260516267 +0000 UTC m=+1044.487053804"
Mar 20 00:23:57 crc kubenswrapper[4867]: I0320 00:23:57.930973 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:57 crc kubenswrapper[4867]: I0320 00:23:57.931604 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:57 crc kubenswrapper[4867]: I0320 00:23:57.980061 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:58 crc kubenswrapper[4867]: I0320 00:23:58.358649 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:23:58 crc kubenswrapper[4867]: I0320 00:23:58.418796 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bf5f7"]
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.145579 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566104-97tkw"]
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.147598 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.151276 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.151695 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.152105 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.158988 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdzd5\" (UniqueName: \"kubernetes.io/projected/6c73fdb6-10f4-403a-b900-40f895b971d4-kube-api-access-fdzd5\") pod \"auto-csr-approver-29566104-97tkw\" (UID: \"6c73fdb6-10f4-403a-b900-40f895b971d4\") " pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.164152 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566104-97tkw"]
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.260394 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdzd5\" (UniqueName: \"kubernetes.io/projected/6c73fdb6-10f4-403a-b900-40f895b971d4-kube-api-access-fdzd5\") pod \"auto-csr-approver-29566104-97tkw\" (UID: \"6c73fdb6-10f4-403a-b900-40f895b971d4\") " pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.297133 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdzd5\" (UniqueName: \"kubernetes.io/projected/6c73fdb6-10f4-403a-b900-40f895b971d4-kube-api-access-fdzd5\") pod \"auto-csr-approver-29566104-97tkw\" (UID: \"6c73fdb6-10f4-403a-b900-40f895b971d4\") " pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.304662 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bf5f7" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="registry-server" containerID="cri-o://9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e" gracePeriod=2
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.480976 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.665948 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.766350 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gswkh\" (UniqueName: \"kubernetes.io/projected/c114d030-d054-44a6-b10b-1ad432c2f9d7-kube-api-access-gswkh\") pod \"c114d030-d054-44a6-b10b-1ad432c2f9d7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") "
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.766420 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-utilities\") pod \"c114d030-d054-44a6-b10b-1ad432c2f9d7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") "
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.766486 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-catalog-content\") pod \"c114d030-d054-44a6-b10b-1ad432c2f9d7\" (UID: \"c114d030-d054-44a6-b10b-1ad432c2f9d7\") "
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.770557 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c114d030-d054-44a6-b10b-1ad432c2f9d7-kube-api-access-gswkh" (OuterVolumeSpecName: "kube-api-access-gswkh") pod "c114d030-d054-44a6-b10b-1ad432c2f9d7" (UID: "c114d030-d054-44a6-b10b-1ad432c2f9d7"). InnerVolumeSpecName "kube-api-access-gswkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.778781 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-utilities" (OuterVolumeSpecName: "utilities") pod "c114d030-d054-44a6-b10b-1ad432c2f9d7" (UID: "c114d030-d054-44a6-b10b-1ad432c2f9d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.815746 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c114d030-d054-44a6-b10b-1ad432c2f9d7" (UID: "c114d030-d054-44a6-b10b-1ad432c2f9d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.867443 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.867471 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gswkh\" (UniqueName: \"kubernetes.io/projected/c114d030-d054-44a6-b10b-1ad432c2f9d7-kube-api-access-gswkh\") on node \"crc\" DevicePath \"\""
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.867481 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c114d030-d054-44a6-b10b-1ad432c2f9d7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 00:24:00 crc kubenswrapper[4867]: I0320 00:24:00.887231 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566104-97tkw"]
Mar 20 00:24:00 crc kubenswrapper[4867]: W0320 00:24:00.897795 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c73fdb6_10f4_403a_b900_40f895b971d4.slice/crio-233ccdbd9027e7bd7d57ea1b8f11464f6edda7738494063c396e47df8b6bb630 WatchSource:0}: Error finding container 233ccdbd9027e7bd7d57ea1b8f11464f6edda7738494063c396e47df8b6bb630: Status 404 returned error can't find the container with id 233ccdbd9027e7bd7d57ea1b8f11464f6edda7738494063c396e47df8b6bb630
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.316015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566104-97tkw" event={"ID":"6c73fdb6-10f4-403a-b900-40f895b971d4","Type":"ContainerStarted","Data":"233ccdbd9027e7bd7d57ea1b8f11464f6edda7738494063c396e47df8b6bb630"}
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.318710 4867 generic.go:334] "Generic (PLEG): container finished" podID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerID="9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e" exitCode=0
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.318750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bf5f7" event={"ID":"c114d030-d054-44a6-b10b-1ad432c2f9d7","Type":"ContainerDied","Data":"9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e"}
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.318773 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bf5f7" event={"ID":"c114d030-d054-44a6-b10b-1ad432c2f9d7","Type":"ContainerDied","Data":"ab049a8c54842ce7380dd35c2cb0a0d57645c6731b16e75c20ac06dc211a4ee7"}
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.318792 4867 scope.go:117] "RemoveContainer" containerID="9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.318942 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bf5f7"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.338139 4867 scope.go:117] "RemoveContainer" containerID="c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.359909 4867 scope.go:117] "RemoveContainer" containerID="3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.363357 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bf5f7"]
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.381022 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bf5f7"]
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.392912 4867 scope.go:117] "RemoveContainer" containerID="9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e"
Mar 20 00:24:01 crc kubenswrapper[4867]: E0320 00:24:01.393524 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e\": container with ID starting with 9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e not found: ID does not exist" containerID="9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.393574 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e"} err="failed to get container status \"9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e\": rpc error: code = NotFound desc = could not find container \"9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e\": container with ID starting with 9b7833423aac067b2ee61df0a40b677c7176a84635560f8daac8e7ecea2f456e not found: ID does not exist"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.393604 4867 scope.go:117] "RemoveContainer" containerID="c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3"
Mar 20 00:24:01 crc kubenswrapper[4867]: E0320 00:24:01.393936 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3\": container with ID starting with c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3 not found: ID does not exist" containerID="c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.393978 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3"} err="failed to get container status \"c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3\": rpc error: code = NotFound desc = could not find container \"c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3\": container with ID starting with c4e4185893bb0e525eead2ad1ca54299d1b2e53a6eae94cbdb531a033ced3da3 not found: ID does not exist"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.394001 4867 scope.go:117] "RemoveContainer" containerID="3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723"
Mar 20 00:24:01 crc kubenswrapper[4867]: E0320 00:24:01.394281 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723\": container with ID starting with 3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723 not found: ID does not exist" containerID="3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723"
Mar 20 00:24:01 crc kubenswrapper[4867]: I0320 00:24:01.394318 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723"} err="failed to get container status \"3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723\": rpc error: code = NotFound desc = could not find container \"3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723\": container with ID starting with 3e748aa238b7a236e0016f9c84f07e44e61995dc55ca94fef7fd3233f076a723 not found: ID does not exist"
Mar 20 00:24:02 crc kubenswrapper[4867]: I0320 00:24:02.348796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566104-97tkw" event={"ID":"6c73fdb6-10f4-403a-b900-40f895b971d4","Type":"ContainerStarted","Data":"b795522ad0f5eed5adb8d6a5e87115e9282d1a3c8488e314fcc939a2405a5dbb"}
Mar 20 00:24:02 crc kubenswrapper[4867]: I0320 00:24:02.375130 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566104-97tkw" podStartSLOduration=1.410621417 podStartE2EDuration="2.375103412s" podCreationTimestamp="2026-03-20 00:24:00 +0000 UTC" firstStartedPulling="2026-03-20 00:24:00.900020952 +0000 UTC m=+1055.126558469" lastFinishedPulling="2026-03-20 00:24:01.864502937 +0000 UTC m=+1056.091040464" observedRunningTime="2026-03-20 00:24:02.368554641 +0000 UTC m=+1056.595092178" watchObservedRunningTime="2026-03-20 00:24:02.375103412 +0000 UTC m=+1056.601640959"
Mar 20 00:24:02 crc kubenswrapper[4867]: I0320 00:24:02.429630 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" path="/var/lib/kubelet/pods/c114d030-d054-44a6-b10b-1ad432c2f9d7/volumes"
Mar 20 00:24:03 crc kubenswrapper[4867]: I0320 00:24:03.358833 4867 generic.go:334] "Generic (PLEG): container finished" podID="6c73fdb6-10f4-403a-b900-40f895b971d4" containerID="b795522ad0f5eed5adb8d6a5e87115e9282d1a3c8488e314fcc939a2405a5dbb" exitCode=0
Mar 20 00:24:03 crc kubenswrapper[4867]: I0320 00:24:03.358879 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566104-97tkw" event={"ID":"6c73fdb6-10f4-403a-b900-40f895b971d4","Type":"ContainerDied","Data":"b795522ad0f5eed5adb8d6a5e87115e9282d1a3c8488e314fcc939a2405a5dbb"}
Mar 20 00:24:04 crc kubenswrapper[4867]: I0320 00:24:04.700018 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:04 crc kubenswrapper[4867]: I0320 00:24:04.818673 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdzd5\" (UniqueName: \"kubernetes.io/projected/6c73fdb6-10f4-403a-b900-40f895b971d4-kube-api-access-fdzd5\") pod \"6c73fdb6-10f4-403a-b900-40f895b971d4\" (UID: \"6c73fdb6-10f4-403a-b900-40f895b971d4\") "
Mar 20 00:24:04 crc kubenswrapper[4867]: I0320 00:24:04.824537 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c73fdb6-10f4-403a-b900-40f895b971d4-kube-api-access-fdzd5" (OuterVolumeSpecName: "kube-api-access-fdzd5") pod "6c73fdb6-10f4-403a-b900-40f895b971d4" (UID: "6c73fdb6-10f4-403a-b900-40f895b971d4"). InnerVolumeSpecName "kube-api-access-fdzd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:24:04 crc kubenswrapper[4867]: I0320 00:24:04.920139 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdzd5\" (UniqueName: \"kubernetes.io/projected/6c73fdb6-10f4-403a-b900-40f895b971d4-kube-api-access-fdzd5\") on node \"crc\" DevicePath \"\""
Mar 20 00:24:05 crc kubenswrapper[4867]: I0320 00:24:05.375269 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566104-97tkw" event={"ID":"6c73fdb6-10f4-403a-b900-40f895b971d4","Type":"ContainerDied","Data":"233ccdbd9027e7bd7d57ea1b8f11464f6edda7738494063c396e47df8b6bb630"}
Mar 20 00:24:05 crc kubenswrapper[4867]: I0320 00:24:05.375631 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="233ccdbd9027e7bd7d57ea1b8f11464f6edda7738494063c396e47df8b6bb630"
Mar 20 00:24:05 crc kubenswrapper[4867]: I0320 00:24:05.375545 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566104-97tkw"
Mar 20 00:24:05 crc kubenswrapper[4867]: I0320 00:24:05.435869 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566098-66gdx"]
Mar 20 00:24:05 crc kubenswrapper[4867]: I0320 00:24:05.440441 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566098-66gdx"]
Mar 20 00:24:06 crc kubenswrapper[4867]: I0320 00:24:06.432825 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e3cfe9-a819-4d53-a513-376208ec2884" path="/var/lib/kubelet/pods/26e3cfe9-a819-4d53-a513-376208ec2884/volumes"
Mar 20 00:24:18 crc kubenswrapper[4867]: I0320 00:24:18.859943 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:24:18 crc kubenswrapper[4867]: I0320 00:24:18.860766 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:24:38 crc kubenswrapper[4867]: I0320 00:24:38.613928 4867 generic.go:334] "Generic (PLEG): container finished" podID="70187bed-f441-4814-9002-8df2d36a5afd" containerID="1dec1291223a0ca2a64d544d482a77ea81702de9cf619e8708b53296463d4677" exitCode=0
Mar 20 00:24:38 crc kubenswrapper[4867]: I0320 00:24:38.614150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerDied","Data":"1dec1291223a0ca2a64d544d482a77ea81702de9cf619e8708b53296463d4677"}
Mar 20 00:24:39 crc kubenswrapper[4867]: I0320 00:24:39.940950 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032341 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-run\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032713 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-ca-bundles\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032743 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-pull\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032778 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-buildcachedir\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032807 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-system-configs\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032839 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-node-pullsecrets\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032860 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-push\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032877 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032891 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-buildworkdir\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032942 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-build-blob-cache\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.032999 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jws5\" (UniqueName: \"kubernetes.io/projected/70187bed-f441-4814-9002-8df2d36a5afd-kube-api-access-2jws5\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.033032 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-root\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.033075 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-proxy-ca-bundles\") pod \"70187bed-f441-4814-9002-8df2d36a5afd\" (UID: \"70187bed-f441-4814-9002-8df2d36a5afd\") "
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.033365 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.033357 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.033980 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.034267 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.034323 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "container-storage-run".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.035096 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.037448 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.038705 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "builder-dockercfg-526rz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.039104 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70187bed-f441-4814-9002-8df2d36a5afd-kube-api-access-2jws5" (OuterVolumeSpecName: "kube-api-access-2jws5") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "kube-api-access-2jws5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.046003 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135393 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135483 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135546 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135565 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/70187bed-f441-4814-9002-8df2d36a5afd-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135584 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/70187bed-f441-4814-9002-8df2d36a5afd-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 
00:24:40.135635 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135653 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jws5\" (UniqueName: \"kubernetes.io/projected/70187bed-f441-4814-9002-8df2d36a5afd-kube-api-access-2jws5\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135672 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70187bed-f441-4814-9002-8df2d36a5afd-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.135725 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.192962 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.237620 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.633747 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"70187bed-f441-4814-9002-8df2d36a5afd","Type":"ContainerDied","Data":"1cb8bcff373f0f31387c199e6c34b1cb4566da24f3c07132e427c80e5d0c5a07"} Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.633807 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb8bcff373f0f31387c199e6c34b1cb4566da24f3c07132e427c80e5d0c5a07" Mar 20 00:24:40 crc kubenswrapper[4867]: I0320 00:24:40.633864 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 20 00:24:42 crc kubenswrapper[4867]: I0320 00:24:42.176219 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "70187bed-f441-4814-9002-8df2d36a5afd" (UID: "70187bed-f441-4814-9002-8df2d36a5afd"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:42 crc kubenswrapper[4867]: I0320 00:24:42.264921 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/70187bed-f441-4814-9002-8df2d36a5afd-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500030 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500570 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c73fdb6-10f4-403a-b900-40f895b971d4" containerName="oc" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500586 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c73fdb6-10f4-403a-b900-40f895b971d4" containerName="oc" Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500597 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="manage-dockerfile" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500609 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="manage-dockerfile" Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500621 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="git-clone" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500629 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="git-clone" Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500644 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="registry-server" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500653 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="registry-server" Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500665 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="extract-content" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500673 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="extract-content" Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500684 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="extract-utilities" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500692 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="extract-utilities" Mar 20 00:24:44 crc kubenswrapper[4867]: E0320 00:24:44.500702 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="docker-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500710 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="docker-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500838 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="70187bed-f441-4814-9002-8df2d36a5afd" containerName="docker-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500851 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c73fdb6-10f4-403a-b900-40f895b971d4" containerName="oc" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.500867 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c114d030-d054-44a6-b10b-1ad432c2f9d7" containerName="registry-server" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.501614 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.504504 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.504767 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.506247 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.507366 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.521409 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.598522 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-run\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.598860 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-pull\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.598996 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.599138 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-root\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.599262 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildworkdir\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.599396 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.599538 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.599720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtxqn\" (UniqueName: 
\"kubernetes.io/projected/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-kube-api-access-jtxqn\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.599868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-system-configs\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.600034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.600246 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildcachedir\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.600406 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-push\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701822 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701889 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtxqn\" (UniqueName: \"kubernetes.io/projected/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-kube-api-access-jtxqn\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701911 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-system-configs\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701943 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701961 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildcachedir\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701977 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-push\") pod 
\"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.701996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-run\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702013 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-pull\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702032 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-root\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702071 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildworkdir\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " 
pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702092 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702132 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildcachedir\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702417 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.702621 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-system-configs\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.703084 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-root\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.703232 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildworkdir\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.703548 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.703605 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.704225 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-run\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.706898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: 
\"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-pull\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.706916 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-push\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.716645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtxqn\" (UniqueName: \"kubernetes.io/projected/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-kube-api-access-jtxqn\") pod \"sg-core-1-build\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " pod="service-telemetry/sg-core-1-build" Mar 20 00:24:44 crc kubenswrapper[4867]: I0320 00:24:44.818769 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 20 00:24:45 crc kubenswrapper[4867]: I0320 00:24:45.008062 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 20 00:24:45 crc kubenswrapper[4867]: I0320 00:24:45.672208 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b","Type":"ContainerStarted","Data":"c7a387c517191edee922696ec6d6a13bba646417e100e826b2b0ea0e32dccedd"} Mar 20 00:24:46 crc kubenswrapper[4867]: I0320 00:24:46.684480 4867 generic.go:334] "Generic (PLEG): container finished" podID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerID="c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84" exitCode=0 Mar 20 00:24:46 crc kubenswrapper[4867]: I0320 00:24:46.684582 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b","Type":"ContainerDied","Data":"c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84"} Mar 20 00:24:47 crc kubenswrapper[4867]: I0320 00:24:47.694140 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b","Type":"ContainerStarted","Data":"3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e"} Mar 20 00:24:47 crc kubenswrapper[4867]: I0320 00:24:47.733544 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.733465784 podStartE2EDuration="3.733465784s" podCreationTimestamp="2026-03-20 00:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:24:47.729161232 +0000 UTC m=+1101.955698809" watchObservedRunningTime="2026-03-20 00:24:47.733465784 +0000 UTC m=+1101.960003311" Mar 20 00:24:48 crc 
kubenswrapper[4867]: I0320 00:24:48.860188 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:24:48 crc kubenswrapper[4867]: I0320 00:24:48.860631 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:24:54 crc kubenswrapper[4867]: I0320 00:24:54.485367 4867 scope.go:117] "RemoveContainer" containerID="ee398b267881cece77c96a5a97ca5b7c75da4388c86f20e9b04c8d5b29520d5e" Mar 20 00:24:54 crc kubenswrapper[4867]: I0320 00:24:54.797468 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 20 00:24:54 crc kubenswrapper[4867]: I0320 00:24:54.797811 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerName="docker-build" containerID="cri-o://3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e" gracePeriod=30 Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.158730 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b/docker-build/0.log" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.159097 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.245830 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-ca-bundles\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.245883 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-push\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.245910 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-run\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.245948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtxqn\" (UniqueName: \"kubernetes.io/projected/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-kube-api-access-jtxqn\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.245975 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildworkdir\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246014 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildcachedir\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246041 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-node-pullsecrets\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246072 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-system-configs\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246095 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-proxy-ca-bundles\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246147 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-root\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246137 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod 
"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246173 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-pull\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246144 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246248 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-blob-cache\") pod \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\" (UID: \"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b\") " Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246535 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246552 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.246645 4867 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.247275 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.247317 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.249670 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.251872 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.253983 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.257642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-kube-api-access-jtxqn" (OuterVolumeSpecName: "kube-api-access-jtxqn") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "kube-api-access-jtxqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.262718 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "builder-dockercfg-526rz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.347189 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.347976 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.347997 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348008 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348017 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348026 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348034 4867 reconciler_common.go:293] 
"Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348042 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348051 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtxqn\" (UniqueName: \"kubernetes.io/projected/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-kube-api-access-jtxqn\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.348060 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.514099 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" (UID: "37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.549810 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.753415 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b/docker-build/0.log" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.754711 4867 generic.go:334] "Generic (PLEG): container finished" podID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerID="3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e" exitCode=1 Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.754782 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b","Type":"ContainerDied","Data":"3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e"} Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.754828 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b","Type":"ContainerDied","Data":"c7a387c517191edee922696ec6d6a13bba646417e100e826b2b0ea0e32dccedd"} Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.754861 4867 scope.go:117] "RemoveContainer" containerID="3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.755474 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.805136 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.807788 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.841181 4867 scope.go:117] "RemoveContainer" containerID="c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.867899 4867 scope.go:117] "RemoveContainer" containerID="3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e" Mar 20 00:24:55 crc kubenswrapper[4867]: E0320 00:24:55.869320 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e\": container with ID starting with 3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e not found: ID does not exist" containerID="3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.869384 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e"} err="failed to get container status \"3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e\": rpc error: code = NotFound desc = could not find container \"3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e\": container with ID starting with 3f63cdce354495cbaf794205f490b46258a98dc6775ae53d100150c677687e6e not found: ID does not exist" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.869422 4867 scope.go:117] "RemoveContainer" containerID="c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84" Mar 20 00:24:55 crc 
kubenswrapper[4867]: E0320 00:24:55.870879 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84\": container with ID starting with c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84 not found: ID does not exist" containerID="c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84" Mar 20 00:24:55 crc kubenswrapper[4867]: I0320 00:24:55.870928 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84"} err="failed to get container status \"c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84\": rpc error: code = NotFound desc = could not find container \"c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84\": container with ID starting with c78c404570639d8574e174c3b1a9211614dfd9378da819c41065b6a463e37c84 not found: ID does not exist" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.376038 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 20 00:24:56 crc kubenswrapper[4867]: E0320 00:24:56.376479 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerName="docker-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.376553 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerName="docker-build" Mar 20 00:24:56 crc kubenswrapper[4867]: E0320 00:24:56.376576 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerName="manage-dockerfile" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.376587 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerName="manage-dockerfile" Mar 20 
00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.376728 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" containerName="docker-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.378598 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.380653 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.381091 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.381475 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.381728 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.401137 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.440305 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b" path="/var/lib/kubelet/pods/37f9df2b-49ee-42aa-bd25-e5aaa2fbe42b/volumes" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.464510 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.464566 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-push\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.464596 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.464617 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-pull\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.464638 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.464865 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-kube-api-access-kqhqh\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 
00:24:56.465033 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.465119 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.465138 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.465165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.465238 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.465328 
4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.566927 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567076 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567135 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567181 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-push\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567229 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567280 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-pull\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567314 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567386 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-kube-api-access-kqhqh\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build" Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567437 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567487 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.567729 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-run\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568084 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-system-configs\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568380 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568390 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568423 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildcachedir\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568509 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-root\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568693 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildworkdir\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.568836 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.570298 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.573370 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-push\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.573605 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-pull\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.598937 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-kube-api-access-kqhqh\") pod \"sg-core-2-build\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") " pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.743679 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 20 00:24:56 crc kubenswrapper[4867]: I0320 00:24:56.998940 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"]
Mar 20 00:24:57 crc kubenswrapper[4867]: I0320 00:24:57.775907 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerStarted","Data":"d27afeca65a709b8329d39d0bc84e29e5de51cfa146b9039c1879910ac35e3e1"}
Mar 20 00:24:57 crc kubenswrapper[4867]: I0320 00:24:57.775976 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerStarted","Data":"8bd64a3fadd5ba6b754f1263e307829d84a7d872738b77626cbe8ed8508a4ffd"}
Mar 20 00:24:58 crc kubenswrapper[4867]: I0320 00:24:58.786113 4867 generic.go:334] "Generic (PLEG): container finished" podID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerID="d27afeca65a709b8329d39d0bc84e29e5de51cfa146b9039c1879910ac35e3e1" exitCode=0
Mar 20 00:24:58 crc kubenswrapper[4867]: I0320 00:24:58.786165 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerDied","Data":"d27afeca65a709b8329d39d0bc84e29e5de51cfa146b9039c1879910ac35e3e1"}
Mar 20 00:24:59 crc kubenswrapper[4867]: I0320 00:24:59.797210 4867 generic.go:334] "Generic (PLEG): container finished" podID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerID="304a79eb999be9499c9f29b57302ceeea1fab146785ed3fbd66862d9557883eb" exitCode=0
Mar 20 00:24:59 crc kubenswrapper[4867]: I0320 00:24:59.797305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerDied","Data":"304a79eb999be9499c9f29b57302ceeea1fab146785ed3fbd66862d9557883eb"}
Mar 20 00:24:59 crc kubenswrapper[4867]: I0320 00:24:59.843040 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa/manage-dockerfile/0.log"
Mar 20 00:25:00 crc kubenswrapper[4867]: I0320 00:25:00.812079 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerStarted","Data":"8bf8e1bff425f4196b232d6d73b65012e5e16e6872298aa5db0eafad74d39715"}
Mar 20 00:25:18 crc kubenswrapper[4867]: I0320 00:25:18.860780 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:25:18 crc kubenswrapper[4867]: I0320 00:25:18.861224 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:25:18 crc kubenswrapper[4867]: I0320 00:25:18.861270 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm"
Mar 20 00:25:18 crc kubenswrapper[4867]: I0320 00:25:18.861876 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34d1897a4a5c08cebf4ce21f533f5206883f8b21f0e746b2aa5a509e335c6921"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 00:25:18 crc kubenswrapper[4867]: I0320 00:25:18.861931 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://34d1897a4a5c08cebf4ce21f533f5206883f8b21f0e746b2aa5a509e335c6921" gracePeriod=600
Mar 20 00:25:19 crc kubenswrapper[4867]: I0320 00:25:19.946598 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="34d1897a4a5c08cebf4ce21f533f5206883f8b21f0e746b2aa5a509e335c6921" exitCode=0
Mar 20 00:25:19 crc kubenswrapper[4867]: I0320 00:25:19.946982 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"34d1897a4a5c08cebf4ce21f533f5206883f8b21f0e746b2aa5a509e335c6921"}
Mar 20 00:25:19 crc kubenswrapper[4867]: I0320 00:25:19.947022 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"2d4ae32207d952a9c4b195a19d9b88321c105b63860ad160eeabe54e733e476b"}
Mar 20 00:25:19 crc kubenswrapper[4867]: I0320 00:25:19.947049 4867 scope.go:117] "RemoveContainer" containerID="33d423cd234b4249f2b7bca2c14b518d37de21e54afaaa7c2a2edca5bfac00fe"
Mar 20 00:25:19 crc kubenswrapper[4867]: I0320 00:25:19.969370 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=23.96935446 podStartE2EDuration="23.96935446s" podCreationTimestamp="2026-03-20 00:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:25:00.84882898 +0000 UTC m=+1115.075366547" watchObservedRunningTime="2026-03-20 00:25:19.96935446 +0000 UTC m=+1134.195891967"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.141129 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566106-bxrm9"]
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.142427 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.145272 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.145777 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.145837 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.161805 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566106-bxrm9"]
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.164971 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmcj\" (UniqueName: \"kubernetes.io/projected/3c2dc34e-ac70-4be1-9a54-475a773fa80e-kube-api-access-4tmcj\") pod \"auto-csr-approver-29566106-bxrm9\" (UID: \"3c2dc34e-ac70-4be1-9a54-475a773fa80e\") " pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.266146 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmcj\" (UniqueName: \"kubernetes.io/projected/3c2dc34e-ac70-4be1-9a54-475a773fa80e-kube-api-access-4tmcj\") pod \"auto-csr-approver-29566106-bxrm9\" (UID: \"3c2dc34e-ac70-4be1-9a54-475a773fa80e\") " pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.290293 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmcj\" (UniqueName: \"kubernetes.io/projected/3c2dc34e-ac70-4be1-9a54-475a773fa80e-kube-api-access-4tmcj\") pod \"auto-csr-approver-29566106-bxrm9\" (UID: \"3c2dc34e-ac70-4be1-9a54-475a773fa80e\") " pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.460600 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:00 crc kubenswrapper[4867]: I0320 00:26:00.732514 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566106-bxrm9"]
Mar 20 00:26:01 crc kubenswrapper[4867]: I0320 00:26:01.230755 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566106-bxrm9" event={"ID":"3c2dc34e-ac70-4be1-9a54-475a773fa80e","Type":"ContainerStarted","Data":"ecc6f727d9ba8cc94e9701f90e0a348c3b23a7def65e2b8435a2fd35518a1315"}
Mar 20 00:26:02 crc kubenswrapper[4867]: I0320 00:26:02.238798 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566106-bxrm9" event={"ID":"3c2dc34e-ac70-4be1-9a54-475a773fa80e","Type":"ContainerStarted","Data":"b5fe49a7c6e1b1e2f6a82c5215a467fbd1eece2c10f9f313630fdeaa46ee28a0"}
Mar 20 00:26:02 crc kubenswrapper[4867]: I0320 00:26:02.261850 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566106-bxrm9" podStartSLOduration=1.287948165 podStartE2EDuration="2.261827822s" podCreationTimestamp="2026-03-20 00:26:00 +0000 UTC" firstStartedPulling="2026-03-20 00:26:00.742167175 +0000 UTC m=+1174.968704702" lastFinishedPulling="2026-03-20 00:26:01.716046812 +0000 UTC m=+1175.942584359" observedRunningTime="2026-03-20 00:26:02.253529357 +0000 UTC m=+1176.480066894" watchObservedRunningTime="2026-03-20 00:26:02.261827822 +0000 UTC m=+1176.488365339"
Mar 20 00:26:03 crc kubenswrapper[4867]: I0320 00:26:03.247331 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c2dc34e-ac70-4be1-9a54-475a773fa80e" containerID="b5fe49a7c6e1b1e2f6a82c5215a467fbd1eece2c10f9f313630fdeaa46ee28a0" exitCode=0
Mar 20 00:26:03 crc kubenswrapper[4867]: I0320 00:26:03.247383 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566106-bxrm9" event={"ID":"3c2dc34e-ac70-4be1-9a54-475a773fa80e","Type":"ContainerDied","Data":"b5fe49a7c6e1b1e2f6a82c5215a467fbd1eece2c10f9f313630fdeaa46ee28a0"}
Mar 20 00:26:04 crc kubenswrapper[4867]: I0320 00:26:04.500894 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:04 crc kubenswrapper[4867]: I0320 00:26:04.632846 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tmcj\" (UniqueName: \"kubernetes.io/projected/3c2dc34e-ac70-4be1-9a54-475a773fa80e-kube-api-access-4tmcj\") pod \"3c2dc34e-ac70-4be1-9a54-475a773fa80e\" (UID: \"3c2dc34e-ac70-4be1-9a54-475a773fa80e\") "
Mar 20 00:26:04 crc kubenswrapper[4867]: I0320 00:26:04.640686 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2dc34e-ac70-4be1-9a54-475a773fa80e-kube-api-access-4tmcj" (OuterVolumeSpecName: "kube-api-access-4tmcj") pod "3c2dc34e-ac70-4be1-9a54-475a773fa80e" (UID: "3c2dc34e-ac70-4be1-9a54-475a773fa80e"). InnerVolumeSpecName "kube-api-access-4tmcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:26:04 crc kubenswrapper[4867]: I0320 00:26:04.735472 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tmcj\" (UniqueName: \"kubernetes.io/projected/3c2dc34e-ac70-4be1-9a54-475a773fa80e-kube-api-access-4tmcj\") on node \"crc\" DevicePath \"\""
Mar 20 00:26:05 crc kubenswrapper[4867]: I0320 00:26:05.263612 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566106-bxrm9" event={"ID":"3c2dc34e-ac70-4be1-9a54-475a773fa80e","Type":"ContainerDied","Data":"ecc6f727d9ba8cc94e9701f90e0a348c3b23a7def65e2b8435a2fd35518a1315"}
Mar 20 00:26:05 crc kubenswrapper[4867]: I0320 00:26:05.263927 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc6f727d9ba8cc94e9701f90e0a348c3b23a7def65e2b8435a2fd35518a1315"
Mar 20 00:26:05 crc kubenswrapper[4867]: I0320 00:26:05.263647 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566106-bxrm9"
Mar 20 00:26:05 crc kubenswrapper[4867]: I0320 00:26:05.325853 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566100-76td9"]
Mar 20 00:26:05 crc kubenswrapper[4867]: I0320 00:26:05.334449 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566100-76td9"]
Mar 20 00:26:06 crc kubenswrapper[4867]: I0320 00:26:06.429935 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="981c2ac7-a232-4a5c-a491-face5b8e5ad2" path="/var/lib/kubelet/pods/981c2ac7-a232-4a5c-a491-face5b8e5ad2/volumes"
Mar 20 00:26:54 crc kubenswrapper[4867]: I0320 00:26:54.590601 4867 scope.go:117] "RemoveContainer" containerID="cdec02430ca48a3d022c151b34fc5df1175068efe41f849e8db5e70b3c070632"
Mar 20 00:27:48 crc kubenswrapper[4867]: I0320 00:27:48.860459 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:27:48 crc kubenswrapper[4867]: I0320 00:27:48.861086 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.139647 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566108-78444"]
Mar 20 00:28:00 crc kubenswrapper[4867]: E0320 00:28:00.140479 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2dc34e-ac70-4be1-9a54-475a773fa80e" containerName="oc"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.140525 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2dc34e-ac70-4be1-9a54-475a773fa80e" containerName="oc"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.140674 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2dc34e-ac70-4be1-9a54-475a773fa80e" containerName="oc"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.141245 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.144536 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.145061 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.146987 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.152571 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566108-78444"]
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.229300 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zxg\" (UniqueName: \"kubernetes.io/projected/1ee1cbc7-9c0b-446c-8a56-d850d57561b9-kube-api-access-46zxg\") pod \"auto-csr-approver-29566108-78444\" (UID: \"1ee1cbc7-9c0b-446c-8a56-d850d57561b9\") " pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.329876 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zxg\" (UniqueName: \"kubernetes.io/projected/1ee1cbc7-9c0b-446c-8a56-d850d57561b9-kube-api-access-46zxg\") pod \"auto-csr-approver-29566108-78444\" (UID: \"1ee1cbc7-9c0b-446c-8a56-d850d57561b9\") " pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.348704 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zxg\" (UniqueName: \"kubernetes.io/projected/1ee1cbc7-9c0b-446c-8a56-d850d57561b9-kube-api-access-46zxg\") pod \"auto-csr-approver-29566108-78444\" (UID: \"1ee1cbc7-9c0b-446c-8a56-d850d57561b9\") " pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.467753 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.927316 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566108-78444"]
Mar 20 00:28:00 crc kubenswrapper[4867]: I0320 00:28:00.940130 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 00:28:01 crc kubenswrapper[4867]: I0320 00:28:01.088471 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566108-78444" event={"ID":"1ee1cbc7-9c0b-446c-8a56-d850d57561b9","Type":"ContainerStarted","Data":"91ea8f302f79e9ca3b741d85c9a07fee84b8eb4ead16b140b4d5006a37da0bfa"}
Mar 20 00:28:09 crc kubenswrapper[4867]: I0320 00:28:09.153814 4867 generic.go:334] "Generic (PLEG): container finished" podID="1ee1cbc7-9c0b-446c-8a56-d850d57561b9" containerID="385e6b6152cb79479396db07f999c6c68101ca0cb80daa78acda438c564a4544" exitCode=0
Mar 20 00:28:09 crc kubenswrapper[4867]: I0320 00:28:09.153916 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566108-78444" event={"ID":"1ee1cbc7-9c0b-446c-8a56-d850d57561b9","Type":"ContainerDied","Data":"385e6b6152cb79479396db07f999c6c68101ca0cb80daa78acda438c564a4544"}
Mar 20 00:28:10 crc kubenswrapper[4867]: I0320 00:28:10.377410 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:10 crc kubenswrapper[4867]: I0320 00:28:10.567705 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zxg\" (UniqueName: \"kubernetes.io/projected/1ee1cbc7-9c0b-446c-8a56-d850d57561b9-kube-api-access-46zxg\") pod \"1ee1cbc7-9c0b-446c-8a56-d850d57561b9\" (UID: \"1ee1cbc7-9c0b-446c-8a56-d850d57561b9\") "
Mar 20 00:28:10 crc kubenswrapper[4867]: I0320 00:28:10.578100 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee1cbc7-9c0b-446c-8a56-d850d57561b9-kube-api-access-46zxg" (OuterVolumeSpecName: "kube-api-access-46zxg") pod "1ee1cbc7-9c0b-446c-8a56-d850d57561b9" (UID: "1ee1cbc7-9c0b-446c-8a56-d850d57561b9"). InnerVolumeSpecName "kube-api-access-46zxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:28:10 crc kubenswrapper[4867]: I0320 00:28:10.669116 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zxg\" (UniqueName: \"kubernetes.io/projected/1ee1cbc7-9c0b-446c-8a56-d850d57561b9-kube-api-access-46zxg\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:11 crc kubenswrapper[4867]: I0320 00:28:11.172807 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566108-78444" event={"ID":"1ee1cbc7-9c0b-446c-8a56-d850d57561b9","Type":"ContainerDied","Data":"91ea8f302f79e9ca3b741d85c9a07fee84b8eb4ead16b140b4d5006a37da0bfa"}
Mar 20 00:28:11 crc kubenswrapper[4867]: I0320 00:28:11.172850 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ea8f302f79e9ca3b741d85c9a07fee84b8eb4ead16b140b4d5006a37da0bfa"
Mar 20 00:28:11 crc kubenswrapper[4867]: I0320 00:28:11.172871 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566108-78444"
Mar 20 00:28:11 crc kubenswrapper[4867]: I0320 00:28:11.435841 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566102-s6lwn"]
Mar 20 00:28:11 crc kubenswrapper[4867]: I0320 00:28:11.447307 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566102-s6lwn"]
Mar 20 00:28:12 crc kubenswrapper[4867]: I0320 00:28:12.431086 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f399014-a31b-4635-8ebf-44180df3444c" path="/var/lib/kubelet/pods/6f399014-a31b-4635-8ebf-44180df3444c/volumes"
Mar 20 00:28:18 crc kubenswrapper[4867]: I0320 00:28:18.860154 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:28:18 crc kubenswrapper[4867]: I0320 00:28:18.860799 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:28:20 crc kubenswrapper[4867]: I0320 00:28:20.234326 4867 generic.go:334] "Generic (PLEG): container finished" podID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerID="8bf8e1bff425f4196b232d6d73b65012e5e16e6872298aa5db0eafad74d39715" exitCode=0
Mar 20 00:28:20 crc kubenswrapper[4867]: I0320 00:28:20.234481 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerDied","Data":"8bf8e1bff425f4196b232d6d73b65012e5e16e6872298aa5db0eafad74d39715"}
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.556378 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723359 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-root\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723445 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-kube-api-access-kqhqh\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723467 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-node-pullsecrets\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723534 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildworkdir\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723573 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-blob-cache\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723605 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-run\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723634 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-push\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723668 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-proxy-ca-bundles\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723714 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-system-configs\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723745 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-ca-bundles\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723767 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildcachedir\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723794 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-pull\") pod \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\" (UID: \"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa\") "
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.723629 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.724378 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.724685 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.725414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.725481 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.725711 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.733751 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-kube-api-access-kqhqh" (OuterVolumeSpecName: "kube-api-access-kqhqh") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "kube-api-access-kqhqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.733783 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "builder-dockercfg-526rz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.733852 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.745256 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.824872 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-kube-api-access-kqhqh\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825155 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825218 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825296 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825350 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825413 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825473 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825628 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825690 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:21 crc kubenswrapper[4867]: I0320 00:28:21.825749 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\""
Mar 20 00:28:22 crc kubenswrapper[4867]: I0320 00:28:22.092548 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "build-blob-cache".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:28:22 crc kubenswrapper[4867]: I0320 00:28:22.129716 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:22 crc kubenswrapper[4867]: I0320 00:28:22.252082 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa","Type":"ContainerDied","Data":"8bd64a3fadd5ba6b754f1263e307829d84a7d872738b77626cbe8ed8508a4ffd"} Mar 20 00:28:22 crc kubenswrapper[4867]: I0320 00:28:22.252154 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bd64a3fadd5ba6b754f1263e307829d84a7d872738b77626cbe8ed8508a4ffd" Mar 20 00:28:22 crc kubenswrapper[4867]: I0320 00:28:22.252259 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 20 00:28:24 crc kubenswrapper[4867]: I0320 00:28:24.401429 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" (UID: "fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:28:24 crc kubenswrapper[4867]: I0320 00:28:24.461805 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.058280 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 20 00:28:26 crc kubenswrapper[4867]: E0320 00:28:26.058929 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="manage-dockerfile" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.058949 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="manage-dockerfile" Mar 20 00:28:26 crc kubenswrapper[4867]: E0320 00:28:26.058970 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="docker-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.058984 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="docker-build" Mar 20 00:28:26 crc kubenswrapper[4867]: E0320 00:28:26.059013 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="git-clone" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.059026 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="git-clone" Mar 20 00:28:26 crc kubenswrapper[4867]: E0320 00:28:26.059051 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee1cbc7-9c0b-446c-8a56-d850d57561b9" containerName="oc" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.059063 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1ee1cbc7-9c0b-446c-8a56-d850d57561b9" containerName="oc" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.059242 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6a6fee-8d0b-4c8c-8849-7aa9f540adaa" containerName="docker-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.059268 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee1cbc7-9c0b-446c-8a56-d850d57561b9" containerName="oc" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.060239 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.062448 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.062627 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.062700 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.064437 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.093696 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187209 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187267 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187457 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187561 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187627 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187680 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187816 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-pull\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187883 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-push\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.187979 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.188001 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc 
kubenswrapper[4867]: I0320 00:28:26.188034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pg8\" (UniqueName: \"kubernetes.io/projected/b7611d33-9f73-4d52-ad5c-3b95fccba798-kube-api-access-x4pg8\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.289267 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.289340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.289394 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-pull\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.289452 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-push\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 
00:28:26.289627 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290096 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290582 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290615 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290650 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pg8\" (UniqueName: \"kubernetes.io/projected/b7611d33-9f73-4d52-ad5c-3b95fccba798-kube-api-access-x4pg8\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290674 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290681 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290726 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290709 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290863 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.290884 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.291186 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.291268 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.291669 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.292176 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-run\") pod 
\"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.292282 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.296195 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-push\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.296847 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-pull\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.319748 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pg8\" (UniqueName: \"kubernetes.io/projected/b7611d33-9f73-4d52-ad5c-3b95fccba798-kube-api-access-x4pg8\") pod \"sg-bridge-1-build\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.389191 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:26 crc kubenswrapper[4867]: I0320 00:28:26.661939 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 20 00:28:27 crc kubenswrapper[4867]: I0320 00:28:27.302292 4867 generic.go:334] "Generic (PLEG): container finished" podID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerID="81c7ac2e7cf52140b5c923ad25f0cbff78951baf2c06df6cef1a1ddc5c938c4a" exitCode=0 Mar 20 00:28:27 crc kubenswrapper[4867]: I0320 00:28:27.302396 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"b7611d33-9f73-4d52-ad5c-3b95fccba798","Type":"ContainerDied","Data":"81c7ac2e7cf52140b5c923ad25f0cbff78951baf2c06df6cef1a1ddc5c938c4a"} Mar 20 00:28:27 crc kubenswrapper[4867]: I0320 00:28:27.302684 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"b7611d33-9f73-4d52-ad5c-3b95fccba798","Type":"ContainerStarted","Data":"21565de6547dac9accf08f9401fb1109d9bd1654a1fa237dc25bbc2014239ab2"} Mar 20 00:28:28 crc kubenswrapper[4867]: I0320 00:28:28.309445 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"b7611d33-9f73-4d52-ad5c-3b95fccba798","Type":"ContainerStarted","Data":"f4e7c692e880a3bf275a26451cfb58488797fca1fe5b363939395239cca67959"} Mar 20 00:28:28 crc kubenswrapper[4867]: I0320 00:28:28.334614 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.334592882 podStartE2EDuration="2.334592882s" podCreationTimestamp="2026-03-20 00:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:28:28.330035436 +0000 UTC m=+1322.556572963" watchObservedRunningTime="2026-03-20 00:28:28.334592882 +0000 UTC m=+1322.561130409" Mar 20 00:28:34 
crc kubenswrapper[4867]: I0320 00:28:34.348726 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_b7611d33-9f73-4d52-ad5c-3b95fccba798/docker-build/0.log" Mar 20 00:28:34 crc kubenswrapper[4867]: I0320 00:28:34.350567 4867 generic.go:334] "Generic (PLEG): container finished" podID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerID="f4e7c692e880a3bf275a26451cfb58488797fca1fe5b363939395239cca67959" exitCode=1 Mar 20 00:28:34 crc kubenswrapper[4867]: I0320 00:28:34.350621 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"b7611d33-9f73-4d52-ad5c-3b95fccba798","Type":"ContainerDied","Data":"f4e7c692e880a3bf275a26451cfb58488797fca1fe5b363939395239cca67959"} Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.735623 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_b7611d33-9f73-4d52-ad5c-3b95fccba798/docker-build/0.log" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.736913 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.918725 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-system-configs\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919156 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildworkdir\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919185 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildcachedir\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919285 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-pull\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919337 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-root\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919386 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-push\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919426 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-run\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919463 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-blob-cache\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919426 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919511 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-proxy-ca-bundles\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919668 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-ca-bundles\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919725 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-node-pullsecrets\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.919796 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4pg8\" (UniqueName: \"kubernetes.io/projected/b7611d33-9f73-4d52-ad5c-3b95fccba798-kube-api-access-x4pg8\") pod \"b7611d33-9f73-4d52-ad5c-3b95fccba798\" (UID: \"b7611d33-9f73-4d52-ad5c-3b95fccba798\") " Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 
00:28:35.920239 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.920327 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.920709 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.920758 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.920772 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b7611d33-9f73-4d52-ad5c-3b95fccba798-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.920786 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 
00:28:35.921340 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.921629 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.921753 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.928973 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.929038 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7611d33-9f73-4d52-ad5c-3b95fccba798-kube-api-access-x4pg8" (OuterVolumeSpecName: "kube-api-access-x4pg8") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "kube-api-access-x4pg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:28:35 crc kubenswrapper[4867]: I0320 00:28:35.933655 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "builder-dockercfg-526rz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.015552 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.021943 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.021983 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/b7611d33-9f73-4d52-ad5c-3b95fccba798-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.021995 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.022009 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.022021 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.022034 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4pg8\" (UniqueName: \"kubernetes.io/projected/b7611d33-9f73-4d52-ad5c-3b95fccba798-kube-api-access-x4pg8\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.022045 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b7611d33-9f73-4d52-ad5c-3b95fccba798-build-system-configs\") on 
node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.348537 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b7611d33-9f73-4d52-ad5c-3b95fccba798" (UID: "b7611d33-9f73-4d52-ad5c-3b95fccba798"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.375405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_b7611d33-9f73-4d52-ad5c-3b95fccba798/docker-build/0.log" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.376281 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"b7611d33-9f73-4d52-ad5c-3b95fccba798","Type":"ContainerDied","Data":"21565de6547dac9accf08f9401fb1109d9bd1654a1fa237dc25bbc2014239ab2"} Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.376350 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21565de6547dac9accf08f9401fb1109d9bd1654a1fa237dc25bbc2014239ab2" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.376549 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.426763 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b7611d33-9f73-4d52-ad5c-3b95fccba798-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.471425 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 20 00:28:36 crc kubenswrapper[4867]: I0320 00:28:36.477537 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.177204 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 20 00:28:38 crc kubenswrapper[4867]: E0320 00:28:38.177770 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerName="manage-dockerfile" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.177783 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerName="manage-dockerfile" Mar 20 00:28:38 crc kubenswrapper[4867]: E0320 00:28:38.177794 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerName="docker-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.177799 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerName="docker-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.177893 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7611d33-9f73-4d52-ad5c-3b95fccba798" containerName="docker-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.178678 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.181174 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.181360 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.182544 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.184834 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.209088 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627067 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627123 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627184 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78h49\" (UniqueName: \"kubernetes.io/projected/13a5612e-2498-4465-a8e9-f12d98440b27-kube-api-access-78h49\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627205 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627220 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627282 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627299 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.627977 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-pull\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.629099 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-push\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.629275 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.631054 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b7611d33-9f73-4d52-ad5c-3b95fccba798" path="/var/lib/kubelet/pods/b7611d33-9f73-4d52-ad5c-3b95fccba798/volumes" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731224 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731310 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731355 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78h49\" (UniqueName: \"kubernetes.io/projected/13a5612e-2498-4465-a8e9-f12d98440b27-kube-api-access-78h49\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731440 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") 
" pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731472 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731557 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731623 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-pull\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-push\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 
00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731709 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731761 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.731750 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.732062 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.732064 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.732251 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.732463 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.732679 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.732701 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.733096 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.734201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.740150 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-push\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.740561 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-pull\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.754191 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78h49\" (UniqueName: \"kubernetes.io/projected/13a5612e-2498-4465-a8e9-f12d98440b27-kube-api-access-78h49\") pod \"sg-bridge-2-build\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:38 crc kubenswrapper[4867]: I0320 00:28:38.795926 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 20 00:28:39 crc kubenswrapper[4867]: I0320 00:28:39.059239 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 20 00:28:39 crc kubenswrapper[4867]: I0320 00:28:39.646981 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerStarted","Data":"583f34af41289f6f9816394f4f2d85e29496a3b6a30a3f16a1420a7661529bb0"} Mar 20 00:28:39 crc kubenswrapper[4867]: I0320 00:28:39.647032 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerStarted","Data":"ca90e9a8d13e21791b6731aa8251f6739be8a8dbd134c90e63630cb14327a811"} Mar 20 00:28:40 crc kubenswrapper[4867]: I0320 00:28:40.656161 4867 generic.go:334] "Generic (PLEG): container finished" podID="13a5612e-2498-4465-a8e9-f12d98440b27" containerID="583f34af41289f6f9816394f4f2d85e29496a3b6a30a3f16a1420a7661529bb0" exitCode=0 Mar 20 00:28:40 crc kubenswrapper[4867]: I0320 00:28:40.656230 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerDied","Data":"583f34af41289f6f9816394f4f2d85e29496a3b6a30a3f16a1420a7661529bb0"} Mar 20 00:28:41 crc kubenswrapper[4867]: I0320 00:28:41.697922 4867 generic.go:334] "Generic (PLEG): container finished" podID="13a5612e-2498-4465-a8e9-f12d98440b27" containerID="7fc1ce76e276d6a24b8541869d5e1f31f593316534598fc2d48ab509ec9d1ba0" exitCode=0 Mar 20 00:28:41 crc kubenswrapper[4867]: I0320 00:28:41.698005 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerDied","Data":"7fc1ce76e276d6a24b8541869d5e1f31f593316534598fc2d48ab509ec9d1ba0"} Mar 20 00:28:41 
crc kubenswrapper[4867]: I0320 00:28:41.747012 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_13a5612e-2498-4465-a8e9-f12d98440b27/manage-dockerfile/0.log" Mar 20 00:28:42 crc kubenswrapper[4867]: I0320 00:28:42.710738 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerStarted","Data":"ba59409c645ba4470e358a4c282d028edd69be6008d45f38f2921ca41f0ca6d9"} Mar 20 00:28:42 crc kubenswrapper[4867]: I0320 00:28:42.761796 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.7617742 podStartE2EDuration="4.7617742s" podCreationTimestamp="2026-03-20 00:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:28:42.756853065 +0000 UTC m=+1336.983390622" watchObservedRunningTime="2026-03-20 00:28:42.7617742 +0000 UTC m=+1336.988311727" Mar 20 00:28:48 crc kubenswrapper[4867]: I0320 00:28:48.860287 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:28:48 crc kubenswrapper[4867]: I0320 00:28:48.860941 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:28:48 crc kubenswrapper[4867]: I0320 00:28:48.861041 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:28:48 crc kubenswrapper[4867]: I0320 00:28:48.861786 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d4ae32207d952a9c4b195a19d9b88321c105b63860ad160eeabe54e733e476b"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 00:28:48 crc kubenswrapper[4867]: I0320 00:28:48.861857 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://2d4ae32207d952a9c4b195a19d9b88321c105b63860ad160eeabe54e733e476b" gracePeriod=600 Mar 20 00:28:49 crc kubenswrapper[4867]: I0320 00:28:49.772098 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="2d4ae32207d952a9c4b195a19d9b88321c105b63860ad160eeabe54e733e476b" exitCode=0 Mar 20 00:28:49 crc kubenswrapper[4867]: I0320 00:28:49.773121 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"2d4ae32207d952a9c4b195a19d9b88321c105b63860ad160eeabe54e733e476b"} Mar 20 00:28:49 crc kubenswrapper[4867]: I0320 00:28:49.773322 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"533398062bfbfbe1fa6b73d9da3e04fbe63602429f4dc45fdc1ed10c2dc75279"} Mar 20 00:28:49 crc kubenswrapper[4867]: I0320 00:28:49.773409 4867 scope.go:117] "RemoveContainer" 
containerID="34d1897a4a5c08cebf4ce21f533f5206883f8b21f0e746b2aa5a509e335c6921" Mar 20 00:28:54 crc kubenswrapper[4867]: I0320 00:28:54.664731 4867 scope.go:117] "RemoveContainer" containerID="17a900a2a06674978edc805b394e10ec18d6fb866835f845221484916811b0c3" Mar 20 00:29:33 crc kubenswrapper[4867]: I0320 00:29:33.085860 4867 generic.go:334] "Generic (PLEG): container finished" podID="13a5612e-2498-4465-a8e9-f12d98440b27" containerID="ba59409c645ba4470e358a4c282d028edd69be6008d45f38f2921ca41f0ca6d9" exitCode=0 Mar 20 00:29:33 crc kubenswrapper[4867]: I0320 00:29:33.085971 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerDied","Data":"ba59409c645ba4470e358a4c282d028edd69be6008d45f38f2921ca41f0ca6d9"} Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.405706 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540650 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-push\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540695 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-node-pullsecrets\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") " Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540717 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-buildworkdir\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540744 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-proxy-ca-bundles\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540768 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-build-blob-cache\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-pull\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540826 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-system-configs\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540806 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540859 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78h49\" (UniqueName: \"kubernetes.io/projected/13a5612e-2498-4465-a8e9-f12d98440b27-kube-api-access-78h49\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540875 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-buildcachedir\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540903 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-ca-bundles\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-root\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.540938 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-run\") pod \"13a5612e-2498-4465-a8e9-f12d98440b27\" (UID: \"13a5612e-2498-4465-a8e9-f12d98440b27\") "
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.541164 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.541956 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.542027 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.542111 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.542126 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.542373 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.544089 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.548098 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.548156 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a5612e-2498-4465-a8e9-f12d98440b27-kube-api-access-78h49" (OuterVolumeSpecName: "kube-api-access-78h49") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "kube-api-access-78h49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.553642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "builder-dockercfg-526rz-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642443 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642511 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642526 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642539 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/13a5612e-2498-4465-a8e9-f12d98440b27-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642551 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642563 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78h49\" (UniqueName: \"kubernetes.io/projected/13a5612e-2498-4465-a8e9-f12d98440b27-kube-api-access-78h49\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642574 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/13a5612e-2498-4465-a8e9-f12d98440b27-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642586 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13a5612e-2498-4465-a8e9-f12d98440b27-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.642596 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.669117 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:29:34 crc kubenswrapper[4867]: I0320 00:29:34.743939 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:35 crc kubenswrapper[4867]: I0320 00:29:35.104359 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"13a5612e-2498-4465-a8e9-f12d98440b27","Type":"ContainerDied","Data":"ca90e9a8d13e21791b6731aa8251f6739be8a8dbd134c90e63630cb14327a811"}
Mar 20 00:29:35 crc kubenswrapper[4867]: I0320 00:29:35.104393 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca90e9a8d13e21791b6731aa8251f6739be8a8dbd134c90e63630cb14327a811"
Mar 20 00:29:35 crc kubenswrapper[4867]: I0320 00:29:35.104461 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 20 00:29:35 crc kubenswrapper[4867]: I0320 00:29:35.228176 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "13a5612e-2498-4465-a8e9-f12d98440b27" (UID: "13a5612e-2498-4465-a8e9-f12d98440b27"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:29:35 crc kubenswrapper[4867]: I0320 00:29:35.251725 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/13a5612e-2498-4465-a8e9-f12d98440b27-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.637330 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 20 00:29:38 crc kubenswrapper[4867]: E0320 00:29:38.638104 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="git-clone"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.638127 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="git-clone"
Mar 20 00:29:38 crc kubenswrapper[4867]: E0320 00:29:38.638150 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="docker-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.638163 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="docker-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: E0320 00:29:38.638201 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="manage-dockerfile"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.638217 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="manage-dockerfile"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.638454 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a5612e-2498-4465-a8e9-f12d98440b27" containerName="docker-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.639635 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.644104 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.644110 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.644600 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.645589 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.645951 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702023 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702076 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702109 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702130 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702166 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702180 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bhl4\" (UniqueName: \"kubernetes.io/projected/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-kube-api-access-4bhl4\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702200 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702275 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702380 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702400 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.702434 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804016 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804060 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804084 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804127 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804207 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804255 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bhl4\" (UniqueName: \"kubernetes.io/projected/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-kube-api-access-4bhl4\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804352 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.804429 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805118 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805344 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805394 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805534 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805876 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805918 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.805942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.810255 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.810407 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.832794 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bhl4\" (UniqueName: \"kubernetes.io/projected/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-kube-api-access-4bhl4\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:38 crc kubenswrapper[4867]: I0320 00:29:38.968083 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:39 crc kubenswrapper[4867]: I0320 00:29:39.536573 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 20 00:29:40 crc kubenswrapper[4867]: I0320 00:29:40.154306 4867 generic.go:334] "Generic (PLEG): container finished" podID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerID="5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9" exitCode=0
Mar 20 00:29:40 crc kubenswrapper[4867]: I0320 00:29:40.154539 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4ec22d3e-fdb7-4298-ba89-bd8b9781300b","Type":"ContainerDied","Data":"5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9"}
Mar 20 00:29:40 crc kubenswrapper[4867]: I0320 00:29:40.154778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4ec22d3e-fdb7-4298-ba89-bd8b9781300b","Type":"ContainerStarted","Data":"e027a5b9ba7b0a42f89252e009f0c2c210258036b72cf0a442decda736647fbb"}
Mar 20 00:29:41 crc kubenswrapper[4867]: I0320 00:29:41.164305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4ec22d3e-fdb7-4298-ba89-bd8b9781300b","Type":"ContainerStarted","Data":"097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63"}
Mar 20 00:29:41 crc kubenswrapper[4867]: I0320 00:29:41.185921 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.185893956 podStartE2EDuration="3.185893956s" podCreationTimestamp="2026-03-20 00:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:29:41.184819359 +0000 UTC m=+1395.411356876" watchObservedRunningTime="2026-03-20 00:29:41.185893956 +0000 UTC m=+1395.412431473"
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.207127 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.208218 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerName="docker-build" containerID="cri-o://097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63" gracePeriod=30
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.562734 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_4ec22d3e-fdb7-4298-ba89-bd8b9781300b/docker-build/0.log"
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.563116 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636738 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-proxy-ca-bundles\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636790 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-root\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636823 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-run\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636848 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildworkdir\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636870 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-node-pullsecrets\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636898 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-push\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-ca-bundles\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636960 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-blob-cache\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.636987 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-system-configs\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.637058 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-pull\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.637113 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bhl4\" (UniqueName: \"kubernetes.io/projected/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-kube-api-access-4bhl4\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.637133 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildcachedir\") pod \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\" (UID: \"4ec22d3e-fdb7-4298-ba89-bd8b9781300b\") "
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.637437 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.637899 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.638553 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.638570 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.638657 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.638758 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.638958 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.646273 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.646306 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "builder-dockercfg-526rz-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.657961 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-kube-api-access-4bhl4" (OuterVolumeSpecName: "kube-api-access-4bhl4") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "kube-api-access-4bhl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.715203 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738548 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738588 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738597 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738607 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738617 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bhl4\" (UniqueName: \"kubernetes.io/projected/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-kube-api-access-4bhl4\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738628 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738636 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-build-proxy-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738644 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738654 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738661 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.738669 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:49 crc kubenswrapper[4867]: I0320 00:29:49.996191 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4ec22d3e-fdb7-4298-ba89-bd8b9781300b" (UID: "4ec22d3e-fdb7-4298-ba89-bd8b9781300b"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.042189 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4ec22d3e-fdb7-4298-ba89-bd8b9781300b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.242703 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_4ec22d3e-fdb7-4298-ba89-bd8b9781300b/docker-build/0.log" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.243381 4867 generic.go:334] "Generic (PLEG): container finished" podID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerID="097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63" exitCode=1 Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.243432 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4ec22d3e-fdb7-4298-ba89-bd8b9781300b","Type":"ContainerDied","Data":"097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63"} Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.243472 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"4ec22d3e-fdb7-4298-ba89-bd8b9781300b","Type":"ContainerDied","Data":"e027a5b9ba7b0a42f89252e009f0c2c210258036b72cf0a442decda736647fbb"} Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.243542 4867 scope.go:117] "RemoveContainer" containerID="097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.243584 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.276704 4867 scope.go:117] "RemoveContainer" containerID="5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.293730 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.296786 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.313977 4867 scope.go:117] "RemoveContainer" containerID="097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63" Mar 20 00:29:50 crc kubenswrapper[4867]: E0320 00:29:50.314607 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63\": container with ID starting with 097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63 not found: ID does not exist" containerID="097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.314651 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63"} err="failed to get container status \"097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63\": rpc error: code = NotFound desc = could not find container \"097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63\": container with ID starting with 097da27b024ba6d85d6a44cb26fe511e23a249a54d97ccff2d57f5684278ee63 not found: ID does not exist" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.314678 4867 scope.go:117] "RemoveContainer" 
containerID="5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9" Mar 20 00:29:50 crc kubenswrapper[4867]: E0320 00:29:50.315127 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9\": container with ID starting with 5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9 not found: ID does not exist" containerID="5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.315165 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9"} err="failed to get container status \"5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9\": rpc error: code = NotFound desc = could not find container \"5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9\": container with ID starting with 5cdec13de6f64319ea587e998115909f19158dc83eae48d1b204600c3b8054a9 not found: ID does not exist" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.435526 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" path="/var/lib/kubelet/pods/4ec22d3e-fdb7-4298-ba89-bd8b9781300b/volumes" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.941657 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 20 00:29:50 crc kubenswrapper[4867]: E0320 00:29:50.941928 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerName="manage-dockerfile" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.941951 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerName="manage-dockerfile" Mar 20 00:29:50 crc 
kubenswrapper[4867]: E0320 00:29:50.941963 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerName="docker-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.941972 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerName="docker-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.942118 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec22d3e-fdb7-4298-ba89-bd8b9781300b" containerName="docker-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.943137 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.946146 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.946845 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.947039 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.947101 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-526rz" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954099 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 
00:29:50.954152 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954183 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954215 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954240 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954264 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954285 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfz79\" (UniqueName: \"kubernetes.io/projected/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-kube-api-access-kfz79\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954324 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954351 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954377 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954429 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.954459 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:50 crc kubenswrapper[4867]: I0320 00:29:50.963749 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055537 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055576 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055596 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-ca-bundles\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055612 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfz79\" (UniqueName: \"kubernetes.io/projected/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-kube-api-access-kfz79\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055643 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055661 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055682 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055721 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" 
(UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055745 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055766 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.055800 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.056010 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.056066 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.056689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.056755 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.056792 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.057663 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.057667 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.057723 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.057767 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.059850 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 
00:29:51.061218 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.074345 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfz79\" (UniqueName: \"kubernetes.io/projected/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-kube-api-access-kfz79\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.263054 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:29:51 crc kubenswrapper[4867]: I0320 00:29:51.496149 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 20 00:29:52 crc kubenswrapper[4867]: I0320 00:29:52.262700 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerStarted","Data":"6664c1cf47db6fdb8f89ee6ccf17e831f07a2dd1c7931e8aab53445b82e92709"} Mar 20 00:29:52 crc kubenswrapper[4867]: I0320 00:29:52.263091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerStarted","Data":"b3ad0174fc7537d50adaf1533d38a2088bd39936cecc35981f49c81fbe8618e8"} Mar 20 00:29:52 crc kubenswrapper[4867]: E0320 00:29:52.418256 4867 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.230:48328->38.102.83.230:33857: write 
tcp 38.102.83.230:48328->38.102.83.230:33857: write: connection reset by peer Mar 20 00:29:53 crc kubenswrapper[4867]: I0320 00:29:53.274592 4867 generic.go:334] "Generic (PLEG): container finished" podID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerID="6664c1cf47db6fdb8f89ee6ccf17e831f07a2dd1c7931e8aab53445b82e92709" exitCode=0 Mar 20 00:29:53 crc kubenswrapper[4867]: I0320 00:29:53.274654 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerDied","Data":"6664c1cf47db6fdb8f89ee6ccf17e831f07a2dd1c7931e8aab53445b82e92709"} Mar 20 00:29:54 crc kubenswrapper[4867]: I0320 00:29:54.287056 4867 generic.go:334] "Generic (PLEG): container finished" podID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerID="c3dbcc766dc0fe1e755a23566457c1b1695106aeef624306e3c1cad33fbd4771" exitCode=0 Mar 20 00:29:54 crc kubenswrapper[4867]: I0320 00:29:54.287102 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerDied","Data":"c3dbcc766dc0fe1e755a23566457c1b1695106aeef624306e3c1cad33fbd4771"} Mar 20 00:29:54 crc kubenswrapper[4867]: I0320 00:29:54.362909 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_c17a156c-a61a-4abe-a8c9-e4915b98f3f9/manage-dockerfile/0.log" Mar 20 00:29:55 crc kubenswrapper[4867]: I0320 00:29:55.299466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerStarted","Data":"4e51dc4a5cd97d5bb1cbbec85028a8a15c477a5d579182e1289aed2810747361"} Mar 20 00:29:55 crc kubenswrapper[4867]: I0320 00:29:55.352037 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
podStartSLOduration=5.35201979 podStartE2EDuration="5.35201979s" podCreationTimestamp="2026-03-20 00:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:29:55.346888507 +0000 UTC m=+1409.573426064" watchObservedRunningTime="2026-03-20 00:29:55.35201979 +0000 UTC m=+1409.578557307" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.134795 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566110-wcb2q"] Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.136261 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.138309 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.138426 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.138573 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.140275 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd"] Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.141240 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.142955 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.143008 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.145432 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566110-wcb2q"] Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.155816 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd"] Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.193390 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-secret-volume\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.193451 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2xm4\" (UniqueName: \"kubernetes.io/projected/3874203e-db49-4df9-ad3e-2064bc94dfd3-kube-api-access-g2xm4\") pod \"auto-csr-approver-29566110-wcb2q\" (UID: \"3874203e-db49-4df9-ad3e-2064bc94dfd3\") " pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.193508 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-config-volume\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.193547 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2ck\" (UniqueName: \"kubernetes.io/projected/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-kube-api-access-cw2ck\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.294959 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-secret-volume\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.295038 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2xm4\" (UniqueName: \"kubernetes.io/projected/3874203e-db49-4df9-ad3e-2064bc94dfd3-kube-api-access-g2xm4\") pod \"auto-csr-approver-29566110-wcb2q\" (UID: \"3874203e-db49-4df9-ad3e-2064bc94dfd3\") " pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.295097 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-config-volume\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: 
I0320 00:30:00.295147 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2ck\" (UniqueName: \"kubernetes.io/projected/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-kube-api-access-cw2ck\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.296266 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-config-volume\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.304837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-secret-volume\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.310986 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2xm4\" (UniqueName: \"kubernetes.io/projected/3874203e-db49-4df9-ad3e-2064bc94dfd3-kube-api-access-g2xm4\") pod \"auto-csr-approver-29566110-wcb2q\" (UID: \"3874203e-db49-4df9-ad3e-2064bc94dfd3\") " pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.331665 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2ck\" (UniqueName: \"kubernetes.io/projected/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-kube-api-access-cw2ck\") pod \"collect-profiles-29566110-m8vbd\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.468690 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.481148 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.686332 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd"] Mar 20 00:30:00 crc kubenswrapper[4867]: I0320 00:30:00.745993 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566110-wcb2q"] Mar 20 00:30:00 crc kubenswrapper[4867]: W0320 00:30:00.748097 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3874203e_db49_4df9_ad3e_2064bc94dfd3.slice/crio-f821fecb2f68c571178a7513f96ec43370165448685b26f424edaf948a509c3f WatchSource:0}: Error finding container f821fecb2f68c571178a7513f96ec43370165448685b26f424edaf948a509c3f: Status 404 returned error can't find the container with id f821fecb2f68c571178a7513f96ec43370165448685b26f424edaf948a509c3f Mar 20 00:30:01 crc kubenswrapper[4867]: I0320 00:30:01.340166 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" event={"ID":"3874203e-db49-4df9-ad3e-2064bc94dfd3","Type":"ContainerStarted","Data":"f821fecb2f68c571178a7513f96ec43370165448685b26f424edaf948a509c3f"} Mar 20 00:30:01 crc kubenswrapper[4867]: I0320 00:30:01.341719 4867 generic.go:334] "Generic (PLEG): container finished" podID="23fa5f63-fbfa-42f5-87e6-e491a41e40b5" containerID="3b0267be3dc6ed09410c1efee8fac4fcd2c5711fa3dcf2975f1e3d909b7eda21" exitCode=0 Mar 20 00:30:01 crc 
kubenswrapper[4867]: I0320 00:30:01.341745 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" event={"ID":"23fa5f63-fbfa-42f5-87e6-e491a41e40b5","Type":"ContainerDied","Data":"3b0267be3dc6ed09410c1efee8fac4fcd2c5711fa3dcf2975f1e3d909b7eda21"} Mar 20 00:30:01 crc kubenswrapper[4867]: I0320 00:30:01.341761 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" event={"ID":"23fa5f63-fbfa-42f5-87e6-e491a41e40b5","Type":"ContainerStarted","Data":"4794b356511d636b6f807d640444a8dba65a154c223d6933a4cd0eddb400bc6f"} Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.354017 4867 generic.go:334] "Generic (PLEG): container finished" podID="3874203e-db49-4df9-ad3e-2064bc94dfd3" containerID="174ee34b0906f9f3e1de47e2f3826e69e611717537339e3cdc1605e95fc84925" exitCode=0 Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.354128 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" event={"ID":"3874203e-db49-4df9-ad3e-2064bc94dfd3","Type":"ContainerDied","Data":"174ee34b0906f9f3e1de47e2f3826e69e611717537339e3cdc1605e95fc84925"} Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.600833 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.632259 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw2ck\" (UniqueName: \"kubernetes.io/projected/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-kube-api-access-cw2ck\") pod \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.632318 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-config-volume\") pod \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.632413 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-secret-volume\") pod \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\" (UID: \"23fa5f63-fbfa-42f5-87e6-e491a41e40b5\") " Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.633193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "23fa5f63-fbfa-42f5-87e6-e491a41e40b5" (UID: "23fa5f63-fbfa-42f5-87e6-e491a41e40b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.641135 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23fa5f63-fbfa-42f5-87e6-e491a41e40b5" (UID: "23fa5f63-fbfa-42f5-87e6-e491a41e40b5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.641664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-kube-api-access-cw2ck" (OuterVolumeSpecName: "kube-api-access-cw2ck") pod "23fa5f63-fbfa-42f5-87e6-e491a41e40b5" (UID: "23fa5f63-fbfa-42f5-87e6-e491a41e40b5"). InnerVolumeSpecName "kube-api-access-cw2ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.733538 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.733567 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw2ck\" (UniqueName: \"kubernetes.io/projected/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-kube-api-access-cw2ck\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:02 crc kubenswrapper[4867]: I0320 00:30:02.733576 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23fa5f63-fbfa-42f5-87e6-e491a41e40b5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.368250 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" event={"ID":"23fa5f63-fbfa-42f5-87e6-e491a41e40b5","Type":"ContainerDied","Data":"4794b356511d636b6f807d640444a8dba65a154c223d6933a4cd0eddb400bc6f"} Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.368748 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4794b356511d636b6f807d640444a8dba65a154c223d6933a4cd0eddb400bc6f" Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.368332 4867 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566110-m8vbd" Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.577444 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.644677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2xm4\" (UniqueName: \"kubernetes.io/projected/3874203e-db49-4df9-ad3e-2064bc94dfd3-kube-api-access-g2xm4\") pod \"3874203e-db49-4df9-ad3e-2064bc94dfd3\" (UID: \"3874203e-db49-4df9-ad3e-2064bc94dfd3\") " Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.650102 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3874203e-db49-4df9-ad3e-2064bc94dfd3-kube-api-access-g2xm4" (OuterVolumeSpecName: "kube-api-access-g2xm4") pod "3874203e-db49-4df9-ad3e-2064bc94dfd3" (UID: "3874203e-db49-4df9-ad3e-2064bc94dfd3"). InnerVolumeSpecName "kube-api-access-g2xm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:30:03 crc kubenswrapper[4867]: I0320 00:30:03.746661 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2xm4\" (UniqueName: \"kubernetes.io/projected/3874203e-db49-4df9-ad3e-2064bc94dfd3-kube-api-access-g2xm4\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:04 crc kubenswrapper[4867]: I0320 00:30:04.391906 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" event={"ID":"3874203e-db49-4df9-ad3e-2064bc94dfd3","Type":"ContainerDied","Data":"f821fecb2f68c571178a7513f96ec43370165448685b26f424edaf948a509c3f"} Mar 20 00:30:04 crc kubenswrapper[4867]: I0320 00:30:04.392342 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f821fecb2f68c571178a7513f96ec43370165448685b26f424edaf948a509c3f" Mar 20 00:30:04 crc kubenswrapper[4867]: I0320 00:30:04.392595 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566110-wcb2q" Mar 20 00:30:04 crc kubenswrapper[4867]: I0320 00:30:04.638905 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566104-97tkw"] Mar 20 00:30:04 crc kubenswrapper[4867]: I0320 00:30:04.644709 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566104-97tkw"] Mar 20 00:30:06 crc kubenswrapper[4867]: I0320 00:30:06.429352 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c73fdb6-10f4-403a-b900-40f895b971d4" path="/var/lib/kubelet/pods/6c73fdb6-10f4-403a-b900-40f895b971d4/volumes" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.438301 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74z9j"] Mar 20 00:30:25 crc kubenswrapper[4867]: E0320 00:30:25.439072 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3874203e-db49-4df9-ad3e-2064bc94dfd3" containerName="oc" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.439084 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3874203e-db49-4df9-ad3e-2064bc94dfd3" containerName="oc" Mar 20 00:30:25 crc kubenswrapper[4867]: E0320 00:30:25.439099 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fa5f63-fbfa-42f5-87e6-e491a41e40b5" containerName="collect-profiles" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.439130 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fa5f63-fbfa-42f5-87e6-e491a41e40b5" containerName="collect-profiles" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.439245 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fa5f63-fbfa-42f5-87e6-e491a41e40b5" containerName="collect-profiles" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.439255 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3874203e-db49-4df9-ad3e-2064bc94dfd3" containerName="oc" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.440007 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.462831 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74z9j"] Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.557009 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzls\" (UniqueName: \"kubernetes.io/projected/840438e9-a814-4321-ae67-c34e7854ff8f-kube-api-access-pnzls\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.557314 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-utilities\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.557945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-catalog-content\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.637284 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7wmfr"] Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.638589 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.651636 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7wmfr"] Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.659563 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzls\" (UniqueName: \"kubernetes.io/projected/840438e9-a814-4321-ae67-c34e7854ff8f-kube-api-access-pnzls\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.659869 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-utilities\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.660546 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-utilities\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.660609 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-catalog-content\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.662694 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-catalog-content\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.694945 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzls\" (UniqueName: \"kubernetes.io/projected/840438e9-a814-4321-ae67-c34e7854ff8f-kube-api-access-pnzls\") pod \"certified-operators-74z9j\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.757299 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.764104 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-utilities\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.764346 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvk5\" (UniqueName: \"kubernetes.io/projected/5a9e224c-9347-46a5-ab5c-9b9240991528-kube-api-access-hfvk5\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.764473 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-catalog-content\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " 
pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.868514 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-catalog-content\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.868578 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-utilities\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.868673 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvk5\" (UniqueName: \"kubernetes.io/projected/5a9e224c-9347-46a5-ab5c-9b9240991528-kube-api-access-hfvk5\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.869034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-catalog-content\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.869084 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-utilities\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc 
kubenswrapper[4867]: I0320 00:30:25.891403 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvk5\" (UniqueName: \"kubernetes.io/projected/5a9e224c-9347-46a5-ab5c-9b9240991528-kube-api-access-hfvk5\") pod \"redhat-operators-7wmfr\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:25 crc kubenswrapper[4867]: I0320 00:30:25.953359 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:26 crc kubenswrapper[4867]: I0320 00:30:26.071832 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74z9j"] Mar 20 00:30:26 crc kubenswrapper[4867]: I0320 00:30:26.365909 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7wmfr"] Mar 20 00:30:26 crc kubenswrapper[4867]: W0320 00:30:26.392360 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9e224c_9347_46a5_ab5c_9b9240991528.slice/crio-619272f2244c5306fd1a86bcefeed4d2077b56ff99e981e0d0a3d5b5e55867ad WatchSource:0}: Error finding container 619272f2244c5306fd1a86bcefeed4d2077b56ff99e981e0d0a3d5b5e55867ad: Status 404 returned error can't find the container with id 619272f2244c5306fd1a86bcefeed4d2077b56ff99e981e0d0a3d5b5e55867ad Mar 20 00:30:26 crc kubenswrapper[4867]: I0320 00:30:26.591673 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerStarted","Data":"7d5deb1cdb61db724f2e31c0f9599819d58ce2e9428facc44c54f3ef2eaef5d0"} Mar 20 00:30:26 crc kubenswrapper[4867]: I0320 00:30:26.594122 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" 
event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerStarted","Data":"619272f2244c5306fd1a86bcefeed4d2077b56ff99e981e0d0a3d5b5e55867ad"} Mar 20 00:30:27 crc kubenswrapper[4867]: I0320 00:30:27.604470 4867 generic.go:334] "Generic (PLEG): container finished" podID="840438e9-a814-4321-ae67-c34e7854ff8f" containerID="7671a55d83fbc2055b8063b93824e73aecd741bcdeecdc1464645691f5de38b5" exitCode=0 Mar 20 00:30:27 crc kubenswrapper[4867]: I0320 00:30:27.604622 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerDied","Data":"7671a55d83fbc2055b8063b93824e73aecd741bcdeecdc1464645691f5de38b5"} Mar 20 00:30:27 crc kubenswrapper[4867]: I0320 00:30:27.608036 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerID="4f91f44cb91e4afba910a07867119afc94aa3df6235e1c90c0dfb5fb7f6dc078" exitCode=0 Mar 20 00:30:27 crc kubenswrapper[4867]: I0320 00:30:27.608068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerDied","Data":"4f91f44cb91e4afba910a07867119afc94aa3df6235e1c90c0dfb5fb7f6dc078"} Mar 20 00:30:28 crc kubenswrapper[4867]: I0320 00:30:28.618187 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerStarted","Data":"c5e587bca0c294bbaff3fd7663fbc9cfaa54b6afaae8f7e22d15928fe500438b"} Mar 20 00:30:29 crc kubenswrapper[4867]: I0320 00:30:29.628834 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerStarted","Data":"0d11c0476b0f2a4c37191aac736c3be1a04548498cc087a22ecb3e440ccfa560"} Mar 20 00:30:29 crc kubenswrapper[4867]: I0320 
00:30:29.631908 4867 generic.go:334] "Generic (PLEG): container finished" podID="840438e9-a814-4321-ae67-c34e7854ff8f" containerID="c5e587bca0c294bbaff3fd7663fbc9cfaa54b6afaae8f7e22d15928fe500438b" exitCode=0 Mar 20 00:30:29 crc kubenswrapper[4867]: I0320 00:30:29.631956 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerDied","Data":"c5e587bca0c294bbaff3fd7663fbc9cfaa54b6afaae8f7e22d15928fe500438b"} Mar 20 00:30:30 crc kubenswrapper[4867]: I0320 00:30:30.639294 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerID="0d11c0476b0f2a4c37191aac736c3be1a04548498cc087a22ecb3e440ccfa560" exitCode=0 Mar 20 00:30:30 crc kubenswrapper[4867]: I0320 00:30:30.639685 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerDied","Data":"0d11c0476b0f2a4c37191aac736c3be1a04548498cc087a22ecb3e440ccfa560"} Mar 20 00:30:30 crc kubenswrapper[4867]: I0320 00:30:30.643993 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerStarted","Data":"550c3aef11b0128302f80a86fa2699e911ee0da2edc672193e492ffc69a6e2e1"} Mar 20 00:30:30 crc kubenswrapper[4867]: I0320 00:30:30.677667 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74z9j" podStartSLOduration=3.08068675 podStartE2EDuration="5.677648887s" podCreationTimestamp="2026-03-20 00:30:25 +0000 UTC" firstStartedPulling="2026-03-20 00:30:27.606522347 +0000 UTC m=+1441.833059874" lastFinishedPulling="2026-03-20 00:30:30.203484454 +0000 UTC m=+1444.430022011" observedRunningTime="2026-03-20 00:30:30.672143254 +0000 UTC m=+1444.898680821" 
watchObservedRunningTime="2026-03-20 00:30:30.677648887 +0000 UTC m=+1444.904186404" Mar 20 00:30:31 crc kubenswrapper[4867]: I0320 00:30:31.652783 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerStarted","Data":"7f4061883a442239f281ad21280c66e2cc4169e07f00c00d7e9955dba7b2f6aa"} Mar 20 00:30:31 crc kubenswrapper[4867]: I0320 00:30:31.672574 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7wmfr" podStartSLOduration=4.246182887 podStartE2EDuration="6.672550969s" podCreationTimestamp="2026-03-20 00:30:25 +0000 UTC" firstStartedPulling="2026-03-20 00:30:28.619522679 +0000 UTC m=+1442.846060196" lastFinishedPulling="2026-03-20 00:30:31.045890761 +0000 UTC m=+1445.272428278" observedRunningTime="2026-03-20 00:30:31.668919975 +0000 UTC m=+1445.895457492" watchObservedRunningTime="2026-03-20 00:30:31.672550969 +0000 UTC m=+1445.899088486" Mar 20 00:30:35 crc kubenswrapper[4867]: I0320 00:30:35.757635 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:35 crc kubenswrapper[4867]: I0320 00:30:35.758683 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:35 crc kubenswrapper[4867]: I0320 00:30:35.813686 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:35 crc kubenswrapper[4867]: I0320 00:30:35.954024 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:35 crc kubenswrapper[4867]: I0320 00:30:35.954118 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:36 
crc kubenswrapper[4867]: I0320 00:30:36.747231 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:36 crc kubenswrapper[4867]: I0320 00:30:36.991283 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7wmfr" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="registry-server" probeResult="failure" output=<
Mar 20 00:30:36 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s
Mar 20 00:30:36 crc kubenswrapper[4867]: > Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.429603 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74z9j"] Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.430279 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74z9j" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="registry-server" containerID="cri-o://550c3aef11b0128302f80a86fa2699e911ee0da2edc672193e492ffc69a6e2e1" gracePeriod=2 Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.718475 4867 generic.go:334] "Generic (PLEG): container finished" podID="840438e9-a814-4321-ae67-c34e7854ff8f" containerID="550c3aef11b0128302f80a86fa2699e911ee0da2edc672193e492ffc69a6e2e1" exitCode=0 Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.718538 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerDied","Data":"550c3aef11b0128302f80a86fa2699e911ee0da2edc672193e492ffc69a6e2e1"} Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.817485 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.979058 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-catalog-content\") pod \"840438e9-a814-4321-ae67-c34e7854ff8f\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.979127 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-utilities\") pod \"840438e9-a814-4321-ae67-c34e7854ff8f\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.979280 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzls\" (UniqueName: \"kubernetes.io/projected/840438e9-a814-4321-ae67-c34e7854ff8f-kube-api-access-pnzls\") pod \"840438e9-a814-4321-ae67-c34e7854ff8f\" (UID: \"840438e9-a814-4321-ae67-c34e7854ff8f\") " Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.982380 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-utilities" (OuterVolumeSpecName: "utilities") pod "840438e9-a814-4321-ae67-c34e7854ff8f" (UID: "840438e9-a814-4321-ae67-c34e7854ff8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:39 crc kubenswrapper[4867]: I0320 00:30:39.992775 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840438e9-a814-4321-ae67-c34e7854ff8f-kube-api-access-pnzls" (OuterVolumeSpecName: "kube-api-access-pnzls") pod "840438e9-a814-4321-ae67-c34e7854ff8f" (UID: "840438e9-a814-4321-ae67-c34e7854ff8f"). InnerVolumeSpecName "kube-api-access-pnzls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.032412 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "840438e9-a814-4321-ae67-c34e7854ff8f" (UID: "840438e9-a814-4321-ae67-c34e7854ff8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.080406 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.080434 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840438e9-a814-4321-ae67-c34e7854ff8f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.080445 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzls\" (UniqueName: \"kubernetes.io/projected/840438e9-a814-4321-ae67-c34e7854ff8f-kube-api-access-pnzls\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.730095 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74z9j" event={"ID":"840438e9-a814-4321-ae67-c34e7854ff8f","Type":"ContainerDied","Data":"7d5deb1cdb61db724f2e31c0f9599819d58ce2e9428facc44c54f3ef2eaef5d0"} Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.730166 4867 scope.go:117] "RemoveContainer" containerID="550c3aef11b0128302f80a86fa2699e911ee0da2edc672193e492ffc69a6e2e1" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.730221 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74z9j" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.750863 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74z9j"] Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.754541 4867 scope.go:117] "RemoveContainer" containerID="c5e587bca0c294bbaff3fd7663fbc9cfaa54b6afaae8f7e22d15928fe500438b" Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.759974 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74z9j"] Mar 20 00:30:40 crc kubenswrapper[4867]: I0320 00:30:40.785123 4867 scope.go:117] "RemoveContainer" containerID="7671a55d83fbc2055b8063b93824e73aecd741bcdeecdc1464645691f5de38b5" Mar 20 00:30:42 crc kubenswrapper[4867]: I0320 00:30:42.437120 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" path="/var/lib/kubelet/pods/840438e9-a814-4321-ae67-c34e7854ff8f/volumes" Mar 20 00:30:46 crc kubenswrapper[4867]: I0320 00:30:46.010631 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:46 crc kubenswrapper[4867]: I0320 00:30:46.060663 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:48 crc kubenswrapper[4867]: I0320 00:30:48.626802 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7wmfr"] Mar 20 00:30:48 crc kubenswrapper[4867]: I0320 00:30:48.627399 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7wmfr" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="registry-server" containerID="cri-o://7f4061883a442239f281ad21280c66e2cc4169e07f00c00d7e9955dba7b2f6aa" gracePeriod=2 Mar 20 00:30:48 crc kubenswrapper[4867]: I0320 
00:30:48.782463 4867 generic.go:334] "Generic (PLEG): container finished" podID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerID="4e51dc4a5cd97d5bb1cbbec85028a8a15c477a5d579182e1289aed2810747361" exitCode=0 Mar 20 00:30:48 crc kubenswrapper[4867]: I0320 00:30:48.782572 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerDied","Data":"4e51dc4a5cd97d5bb1cbbec85028a8a15c477a5d579182e1289aed2810747361"} Mar 20 00:30:48 crc kubenswrapper[4867]: I0320 00:30:48.785722 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerID="7f4061883a442239f281ad21280c66e2cc4169e07f00c00d7e9955dba7b2f6aa" exitCode=0 Mar 20 00:30:48 crc kubenswrapper[4867]: I0320 00:30:48.785777 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerDied","Data":"7f4061883a442239f281ad21280c66e2cc4169e07f00c00d7e9955dba7b2f6aa"} Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.033976 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.202732 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-catalog-content\") pod \"5a9e224c-9347-46a5-ab5c-9b9240991528\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.202787 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-utilities\") pod \"5a9e224c-9347-46a5-ab5c-9b9240991528\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.202831 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfvk5\" (UniqueName: \"kubernetes.io/projected/5a9e224c-9347-46a5-ab5c-9b9240991528-kube-api-access-hfvk5\") pod \"5a9e224c-9347-46a5-ab5c-9b9240991528\" (UID: \"5a9e224c-9347-46a5-ab5c-9b9240991528\") " Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.204472 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-utilities" (OuterVolumeSpecName: "utilities") pod "5a9e224c-9347-46a5-ab5c-9b9240991528" (UID: "5a9e224c-9347-46a5-ab5c-9b9240991528"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.214875 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9e224c-9347-46a5-ab5c-9b9240991528-kube-api-access-hfvk5" (OuterVolumeSpecName: "kube-api-access-hfvk5") pod "5a9e224c-9347-46a5-ab5c-9b9240991528" (UID: "5a9e224c-9347-46a5-ab5c-9b9240991528"). InnerVolumeSpecName "kube-api-access-hfvk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.305325 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfvk5\" (UniqueName: \"kubernetes.io/projected/5a9e224c-9347-46a5-ab5c-9b9240991528-kube-api-access-hfvk5\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.305398 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.371865 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a9e224c-9347-46a5-ab5c-9b9240991528" (UID: "5a9e224c-9347-46a5-ab5c-9b9240991528"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.407620 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a9e224c-9347-46a5-ab5c-9b9240991528-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.799397 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmfr" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.799417 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmfr" event={"ID":"5a9e224c-9347-46a5-ab5c-9b9240991528","Type":"ContainerDied","Data":"619272f2244c5306fd1a86bcefeed4d2077b56ff99e981e0d0a3d5b5e55867ad"} Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.799571 4867 scope.go:117] "RemoveContainer" containerID="7f4061883a442239f281ad21280c66e2cc4169e07f00c00d7e9955dba7b2f6aa" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.849665 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7wmfr"] Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.856732 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7wmfr"] Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.862233 4867 scope.go:117] "RemoveContainer" containerID="0d11c0476b0f2a4c37191aac736c3be1a04548498cc087a22ecb3e440ccfa560" Mar 20 00:30:49 crc kubenswrapper[4867]: I0320 00:30:49.904293 4867 scope.go:117] "RemoveContainer" containerID="4f91f44cb91e4afba910a07867119afc94aa3df6235e1c90c0dfb5fb7f6dc078" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.126064 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219530 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildcachedir\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219592 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-proxy-ca-bundles\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219641 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-blob-cache\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219639 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219672 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfz79\" (UniqueName: \"kubernetes.io/projected/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-kube-api-access-kfz79\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219773 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-run\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219848 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-system-configs\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219902 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-node-pullsecrets\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219927 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-pull\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.219980 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildworkdir\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220011 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-ca-bundles\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220021 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220051 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-push\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220078 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-root\") pod \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\" (UID: \"c17a156c-a61a-4abe-a8c9-e4915b98f3f9\") " Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220428 4867 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220446 4867 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220477 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.220542 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.221034 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.221075 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.223208 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-kube-api-access-kfz79" (OuterVolumeSpecName: "kube-api-access-kfz79") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "kube-api-access-kfz79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.224232 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.224860 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-pull" (OuterVolumeSpecName: "builder-dockercfg-526rz-pull") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "builder-dockercfg-526rz-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.225664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-push" (OuterVolumeSpecName: "builder-dockercfg-526rz-push") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "builder-dockercfg-526rz-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322191 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322760 4867 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322776 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-pull\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-pull\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322793 4867 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322807 4867 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322820 4867 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-526rz-push\" (UniqueName: \"kubernetes.io/secret/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-builder-dockercfg-526rz-push\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322834 4867 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.322847 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfz79\" (UniqueName: \"kubernetes.io/projected/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-kube-api-access-kfz79\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.334327 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.423720 4867 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.465606 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" path="/var/lib/kubelet/pods/5a9e224c-9347-46a5-ab5c-9b9240991528/volumes" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.809725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"c17a156c-a61a-4abe-a8c9-e4915b98f3f9","Type":"ContainerDied","Data":"b3ad0174fc7537d50adaf1533d38a2088bd39936cecc35981f49c81fbe8618e8"} Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.809776 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3ad0174fc7537d50adaf1533d38a2088bd39936cecc35981f49c81fbe8618e8" Mar 20 00:30:50 crc kubenswrapper[4867]: I0320 00:30:50.809892 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 20 00:30:51 crc kubenswrapper[4867]: I0320 00:30:51.262438 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c17a156c-a61a-4abe-a8c9-e4915b98f3f9" (UID: "c17a156c-a61a-4abe-a8c9-e4915b98f3f9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:30:51 crc kubenswrapper[4867]: I0320 00:30:51.336452 4867 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c17a156c-a61a-4abe-a8c9-e4915b98f3f9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 20 00:30:54 crc kubenswrapper[4867]: I0320 00:30:54.788903 4867 scope.go:117] "RemoveContainer" containerID="b795522ad0f5eed5adb8d6a5e87115e9282d1a3c8488e314fcc939a2405a5dbb" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.451418 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5"] Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.452993 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="extract-utilities" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453106 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="extract-utilities" Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.453171 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="registry-server" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453239 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="registry-server" Mar 20 
00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.453313 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="registry-server" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453372 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="registry-server" Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.453452 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="extract-content" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453527 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="extract-content" Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.453611 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="docker-build" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453660 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="docker-build" Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.453713 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="git-clone" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453773 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="git-clone" Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.453840 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="manage-dockerfile" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.453891 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="manage-dockerfile" Mar 20 00:31:03 crc 
kubenswrapper[4867]: E0320 00:31:03.453947 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="extract-utilities" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.454014 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="extract-utilities" Mar 20 00:31:03 crc kubenswrapper[4867]: E0320 00:31:03.454073 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="extract-content" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.454127 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="extract-content" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.454279 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9e224c-9347-46a5-ab5c-9b9240991528" containerName="registry-server" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.454345 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17a156c-a61a-4abe-a8c9-e4915b98f3f9" containerName="docker-build" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.454403 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="840438e9-a814-4321-ae67-c34e7854ff8f" containerName="registry-server" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.454846 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.457246 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-z7snb" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.470094 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5"] Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.586435 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zprcf\" (UniqueName: \"kubernetes.io/projected/7d8af434-ddb1-4216-bf3a-d123252c4325-kube-api-access-zprcf\") pod \"smart-gateway-operator-5fff4dbc4c-bpbl5\" (UID: \"7d8af434-ddb1-4216-bf3a-d123252c4325\") " pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.586514 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7d8af434-ddb1-4216-bf3a-d123252c4325-runner\") pod \"smart-gateway-operator-5fff4dbc4c-bpbl5\" (UID: \"7d8af434-ddb1-4216-bf3a-d123252c4325\") " pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.688257 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7d8af434-ddb1-4216-bf3a-d123252c4325-runner\") pod \"smart-gateway-operator-5fff4dbc4c-bpbl5\" (UID: \"7d8af434-ddb1-4216-bf3a-d123252c4325\") " pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.688481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zprcf\" (UniqueName: 
\"kubernetes.io/projected/7d8af434-ddb1-4216-bf3a-d123252c4325-kube-api-access-zprcf\") pod \"smart-gateway-operator-5fff4dbc4c-bpbl5\" (UID: \"7d8af434-ddb1-4216-bf3a-d123252c4325\") " pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.688993 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7d8af434-ddb1-4216-bf3a-d123252c4325-runner\") pod \"smart-gateway-operator-5fff4dbc4c-bpbl5\" (UID: \"7d8af434-ddb1-4216-bf3a-d123252c4325\") " pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.710069 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprcf\" (UniqueName: \"kubernetes.io/projected/7d8af434-ddb1-4216-bf3a-d123252c4325-kube-api-access-zprcf\") pod \"smart-gateway-operator-5fff4dbc4c-bpbl5\" (UID: \"7d8af434-ddb1-4216-bf3a-d123252c4325\") " pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:03 crc kubenswrapper[4867]: I0320 00:31:03.772340 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" Mar 20 00:31:04 crc kubenswrapper[4867]: I0320 00:31:04.000273 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5"] Mar 20 00:31:04 crc kubenswrapper[4867]: W0320 00:31:04.016308 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d8af434_ddb1_4216_bf3a_d123252c4325.slice/crio-263a5b1765ef186b22a2729282b44739abde912044a6c361146a1f936a5b0c11 WatchSource:0}: Error finding container 263a5b1765ef186b22a2729282b44739abde912044a6c361146a1f936a5b0c11: Status 404 returned error can't find the container with id 263a5b1765ef186b22a2729282b44739abde912044a6c361146a1f936a5b0c11 Mar 20 00:31:04 crc kubenswrapper[4867]: I0320 00:31:04.913719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" event={"ID":"7d8af434-ddb1-4216-bf3a-d123252c4325","Type":"ContainerStarted","Data":"263a5b1765ef186b22a2729282b44739abde912044a6c361146a1f936a5b0c11"} Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.240101 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-7474fb55fb-cmvds"] Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.241990 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.243813 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-zmxbp" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.258263 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7474fb55fb-cmvds"] Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.274159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d25r\" (UniqueName: \"kubernetes.io/projected/44792f23-3a52-46e0-b91d-1e1853b6437f-kube-api-access-8d25r\") pod \"service-telemetry-operator-7474fb55fb-cmvds\" (UID: \"44792f23-3a52-46e0-b91d-1e1853b6437f\") " pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.274474 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/44792f23-3a52-46e0-b91d-1e1853b6437f-runner\") pod \"service-telemetry-operator-7474fb55fb-cmvds\" (UID: \"44792f23-3a52-46e0-b91d-1e1853b6437f\") " pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.375542 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d25r\" (UniqueName: \"kubernetes.io/projected/44792f23-3a52-46e0-b91d-1e1853b6437f-kube-api-access-8d25r\") pod \"service-telemetry-operator-7474fb55fb-cmvds\" (UID: \"44792f23-3a52-46e0-b91d-1e1853b6437f\") " pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.375638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/44792f23-3a52-46e0-b91d-1e1853b6437f-runner\") pod \"service-telemetry-operator-7474fb55fb-cmvds\" (UID: \"44792f23-3a52-46e0-b91d-1e1853b6437f\") " pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.376167 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/44792f23-3a52-46e0-b91d-1e1853b6437f-runner\") pod \"service-telemetry-operator-7474fb55fb-cmvds\" (UID: \"44792f23-3a52-46e0-b91d-1e1853b6437f\") " pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.400041 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d25r\" (UniqueName: \"kubernetes.io/projected/44792f23-3a52-46e0-b91d-1e1853b6437f-kube-api-access-8d25r\") pod \"service-telemetry-operator-7474fb55fb-cmvds\" (UID: \"44792f23-3a52-46e0-b91d-1e1853b6437f\") " pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:09 crc kubenswrapper[4867]: I0320 00:31:09.565742 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" Mar 20 00:31:13 crc kubenswrapper[4867]: I0320 00:31:13.886714 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7474fb55fb-cmvds"] Mar 20 00:31:17 crc kubenswrapper[4867]: W0320 00:31:17.607783 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44792f23_3a52_46e0_b91d_1e1853b6437f.slice/crio-d2cf50c511e5c1ddcd442213626a47d43ea7b0194fcb094c5050cde87c780229 WatchSource:0}: Error finding container d2cf50c511e5c1ddcd442213626a47d43ea7b0194fcb094c5050cde87c780229: Status 404 returned error can't find the container with id d2cf50c511e5c1ddcd442213626a47d43ea7b0194fcb094c5050cde87c780229 Mar 20 00:31:17 crc kubenswrapper[4867]: E0320 00:31:17.988643 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Mar 20 00:31:17 crc kubenswrapper[4867]: E0320 00:31:17.988817 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773966652,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zprcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],
Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-5fff4dbc4c-bpbl5_service-telemetry(7d8af434-ddb1-4216-bf3a-d123252c4325): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 00:31:17 crc kubenswrapper[4867]: E0320 00:31:17.990187 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" podUID="7d8af434-ddb1-4216-bf3a-d123252c4325" Mar 20 00:31:18 crc kubenswrapper[4867]: I0320 00:31:18.035772 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" event={"ID":"44792f23-3a52-46e0-b91d-1e1853b6437f","Type":"ContainerStarted","Data":"d2cf50c511e5c1ddcd442213626a47d43ea7b0194fcb094c5050cde87c780229"} Mar 20 00:31:18 crc kubenswrapper[4867]: E0320 00:31:18.037371 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" podUID="7d8af434-ddb1-4216-bf3a-d123252c4325" Mar 20 00:31:18 crc kubenswrapper[4867]: I0320 00:31:18.861042 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:31:18 crc kubenswrapper[4867]: I0320 00:31:18.861696 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:31:24 crc kubenswrapper[4867]: I0320 00:31:24.079745 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" event={"ID":"44792f23-3a52-46e0-b91d-1e1853b6437f","Type":"ContainerStarted","Data":"24f396e27affdd10b3a3786757aadbe0e7f25d57d40ebb7f1bf4d7b4822469c7"} Mar 20 00:31:24 crc kubenswrapper[4867]: I0320 00:31:24.110915 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-7474fb55fb-cmvds" podStartSLOduration=9.76056739 podStartE2EDuration="15.110888172s" podCreationTimestamp="2026-03-20 00:31:09 +0000 UTC" firstStartedPulling="2026-03-20 00:31:17.610267185 +0000 UTC m=+1491.836804712" lastFinishedPulling="2026-03-20 00:31:22.960587977 +0000 UTC m=+1497.187125494" observedRunningTime="2026-03-20 00:31:24.103998423 +0000 UTC m=+1498.330535970" watchObservedRunningTime="2026-03-20 00:31:24.110888172 +0000 UTC m=+1498.337425709" Mar 20 00:31:32 crc kubenswrapper[4867]: I0320 00:31:32.147807 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" event={"ID":"7d8af434-ddb1-4216-bf3a-d123252c4325","Type":"ContainerStarted","Data":"16298abc43e0bf35fb462ee39685c171b1bff11b3d0cb4d58f1a7c53029f4165"} Mar 20 00:31:32 crc kubenswrapper[4867]: I0320 00:31:32.180930 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/smart-gateway-operator-5fff4dbc4c-bpbl5" podStartSLOduration=1.31650938 podStartE2EDuration="29.180903247s" podCreationTimestamp="2026-03-20 00:31:03 +0000 UTC" firstStartedPulling="2026-03-20 00:31:04.018447087 +0000 UTC m=+1478.244984604" lastFinishedPulling="2026-03-20 00:31:31.882840954 +0000 UTC m=+1506.109378471" observedRunningTime="2026-03-20 00:31:32.167100539 +0000 UTC m=+1506.393638066" watchObservedRunningTime="2026-03-20 00:31:32.180903247 +0000 UTC m=+1506.407440774" Mar 20 00:31:48 crc kubenswrapper[4867]: I0320 00:31:48.860037 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:31:48 crc kubenswrapper[4867]: I0320 00:31:48.861023 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.421059 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-txgjt"] Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.422451 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.424756 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-ss48v" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.425099 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.425194 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.425449 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.425747 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.426032 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.426187 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.441290 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-txgjt"] Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: 
\"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557645 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557724 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557756 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76xv\" (UniqueName: \"kubernetes.io/projected/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-kube-api-access-g76xv\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557810 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-config\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557856 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.557893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-users\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.659245 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.659636 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.659755 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.659874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76xv\" (UniqueName: \"kubernetes.io/projected/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-kube-api-access-g76xv\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.659992 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-config\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.660100 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.660220 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-users\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc 
kubenswrapper[4867]: I0320 00:31:49.662188 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-config\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.669878 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.670443 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.672679 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.678956 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.679112 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-users\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.693584 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76xv\" (UniqueName: \"kubernetes.io/projected/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-kube-api-access-g76xv\") pod \"default-interconnect-68864d46cb-txgjt\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.740644 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:31:49 crc kubenswrapper[4867]: I0320 00:31:49.956600 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-txgjt"] Mar 20 00:31:50 crc kubenswrapper[4867]: I0320 00:31:50.268779 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" event={"ID":"780cb59f-bd17-4227-92b0-8f9a8ffbddfc","Type":"ContainerStarted","Data":"ce27f3a9633d78b2d52c61f93b843f300065265b6997529058096cd4e92d0b16"} Mar 20 00:31:56 crc kubenswrapper[4867]: I0320 00:31:56.311024 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" event={"ID":"780cb59f-bd17-4227-92b0-8f9a8ffbddfc","Type":"ContainerStarted","Data":"5d60e91a67c17c63cf8f0f3682930a0bb1ba6d31565db97d6c0ba86b29c36346"} Mar 20 00:31:56 crc kubenswrapper[4867]: I0320 00:31:56.335640 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" podStartSLOduration=2.098790128 podStartE2EDuration="7.335611426s" podCreationTimestamp="2026-03-20 00:31:49 +0000 UTC" firstStartedPulling="2026-03-20 00:31:49.968932994 +0000 UTC m=+1524.195470511" lastFinishedPulling="2026-03-20 00:31:55.205754282 +0000 UTC m=+1529.432291809" observedRunningTime="2026-03-20 00:31:56.327207968 +0000 UTC m=+1530.553745625" watchObservedRunningTime="2026-03-20 00:31:56.335611426 +0000 UTC m=+1530.562148973" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.461620 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.464120 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.465901 4867 reflector.go:561] object-"service-telemetry"/"default-session-secret": failed to list *v1.Secret: secrets "default-session-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.466080 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"default-session-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-session-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.466145 4867 reflector.go:561] object-"service-telemetry"/"prometheus-stf-dockercfg-77j8b": failed to list *v1.Secret: secrets "prometheus-stf-dockercfg-77j8b" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.466292 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-stf-dockercfg-77j8b\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"prometheus-stf-dockercfg-77j8b\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.466182 4867 reflector.go:561] object-"service-telemetry"/"serving-certs-ca-bundle": failed to list *v1.ConfigMap: configmaps 
"serving-certs-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.466745 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"serving-certs-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"serving-certs-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.466804 4867 reflector.go:561] object-"service-telemetry"/"prometheus-default-rulefiles-1": failed to list *v1.ConfigMap: configmaps "prometheus-default-rulefiles-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.466896 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-default-rulefiles-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"prometheus-default-rulefiles-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.466912 4867 reflector.go:561] object-"service-telemetry"/"prometheus-default-web-config": failed to list *v1.Secret: secrets "prometheus-default-web-config" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 
crc kubenswrapper[4867]: E0320 00:31:59.467035 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-default-web-config\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"prometheus-default-web-config\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.466917 4867 reflector.go:561] object-"service-telemetry"/"prometheus-default-rulefiles-2": failed to list *v1.ConfigMap: configmaps "prometheus-default-rulefiles-2" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.467171 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-default-rulefiles-2\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"prometheus-default-rulefiles-2\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.467472 4867 reflector.go:561] object-"service-telemetry"/"prometheus-default-tls-assets-0": failed to list *v1.Secret: secrets "prometheus-default-tls-assets-0" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.467541 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-default-tls-assets-0\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets 
\"prometheus-default-tls-assets-0\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.467484 4867 reflector.go:561] object-"service-telemetry"/"prometheus-default-rulefiles-0": failed to list *v1.ConfigMap: configmaps "prometheus-default-rulefiles-0" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.467570 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-default-rulefiles-0\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"prometheus-default-rulefiles-0\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: W0320 00:31:59.467873 4867 reflector.go:561] object-"service-telemetry"/"prometheus-default": failed to list *v1.Secret: secrets "prometheus-default" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.467898 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"prometheus-default\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"prometheus-default\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc 
kubenswrapper[4867]: W0320 00:31:59.468588 4867 reflector.go:561] object-"service-telemetry"/"default-prometheus-proxy-tls": failed to list *v1.Secret: secrets "default-prometheus-proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Mar 20 00:31:59 crc kubenswrapper[4867]: E0320 00:31:59.468609 4867 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"default-prometheus-proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-prometheus-proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.487355 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.601316 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.601671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-web-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.601766 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.601852 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.601940 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.602026 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.602207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc 
kubenswrapper[4867]: I0320 00:31:59.602310 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4cea5088-2867-4484-914c-70644f9eed47-tls-assets\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.602388 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4cea5088-2867-4484-914c-70644f9eed47-config-out\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.602473 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.602576 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtqx\" (UniqueName: \"kubernetes.io/projected/4cea5088-2867-4484-914c-70644f9eed47-kube-api-access-7wtqx\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.602669 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " 
pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.703655 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-web-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.703849 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704006 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704121 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704237 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " 
pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704345 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704446 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4cea5088-2867-4484-914c-70644f9eed47-tls-assets\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4cea5088-2867-4484-914c-70644f9eed47-config-out\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704659 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704751 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtqx\" (UniqueName: \"kubernetes.io/projected/4cea5088-2867-4484-914c-70644f9eed47-kube-api-access-7wtqx\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " 
pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.704908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.707445 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.707483 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/21ffbc6f770c0063c07a4bd51101683158da5f80f7defb113504cf7a4e3df3af/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.719138 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4cea5088-2867-4484-914c-70644f9eed47-config-out\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.731412 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtqx\" (UniqueName: \"kubernetes.io/projected/4cea5088-2867-4484-914c-70644f9eed47-kube-api-access-7wtqx\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:31:59 crc kubenswrapper[4867]: I0320 00:31:59.760585 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d1f3843-e4b2-4514-b450-fbf26f4865a4\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.129130 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566112-xv5p6"] Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.130180 
4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566112-xv5p6" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.132225 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.133134 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.133136 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.136778 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566112-xv5p6"] Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.211964 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdltq\" (UniqueName: \"kubernetes.io/projected/7a980d9d-47a1-4275-99e9-0677f0baff73-kube-api-access-fdltq\") pod \"auto-csr-approver-29566112-xv5p6\" (UID: \"7a980d9d-47a1-4275-99e9-0677f0baff73\") " pod="openshift-infra/auto-csr-approver-29566112-xv5p6" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.272324 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.278150 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-web-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.278976 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 
20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.285943 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.313351 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdltq\" (UniqueName: \"kubernetes.io/projected/7a980d9d-47a1-4275-99e9-0677f0baff73-kube-api-access-fdltq\") pod \"auto-csr-approver-29566112-xv5p6\" (UID: \"7a980d9d-47a1-4275-99e9-0677f0baff73\") " pod="openshift-infra/auto-csr-approver-29566112-xv5p6" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.331809 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdltq\" (UniqueName: \"kubernetes.io/projected/7a980d9d-47a1-4275-99e9-0677f0baff73-kube-api-access-fdltq\") pod \"auto-csr-approver-29566112-xv5p6\" (UID: \"7a980d9d-47a1-4275-99e9-0677f0baff73\") " pod="openshift-infra/auto-csr-approver-29566112-xv5p6" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.438384 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.448562 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4cea5088-2867-4484-914c-70644f9eed47-tls-assets\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.457675 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566112-xv5p6" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.669479 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.673339 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.676929 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.681208 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0" Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.703979 4867 secret.go:188] Couldn't get secret service-telemetry/default-session-secret: failed to sync secret cache: timed out waiting for the condition Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.704094 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-session-secret podName:4cea5088-2867-4484-914c-70644f9eed47 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:01.204064715 +0000 UTC m=+1535.430602232 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-session-secret" (UniqueName: "kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-session-secret") pod "prometheus-default-0" (UID: "4cea5088-2867-4484-914c-70644f9eed47") : failed to sync secret cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.704867 4867 configmap.go:193] Couldn't get configMap service-telemetry/prometheus-default-rulefiles-2: failed to sync configmap cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.704913 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-2 podName:4cea5088-2867-4484-914c-70644f9eed47 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:01.204898897 +0000 UTC m=+1535.431436414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-default-rulefiles-2" (UniqueName: "kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-2") pod "prometheus-default-0" (UID: "4cea5088-2867-4484-914c-70644f9eed47") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.704949 4867 secret.go:188] Couldn't get secret service-telemetry/prometheus-default: failed to sync secret cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.704985 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-config podName:4cea5088-2867-4484-914c-70644f9eed47 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:01.204973039 +0000 UTC m=+1535.431510556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-config") pod "prometheus-default-0" (UID: "4cea5088-2867-4484-914c-70644f9eed47") : failed to sync secret cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.705008 4867 configmap.go:193] Couldn't get configMap service-telemetry/prometheus-default-rulefiles-1: failed to sync configmap cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: E0320 00:32:00.705036 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-1 podName:4cea5088-2867-4484-914c-70644f9eed47 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:01.20502747 +0000 UTC m=+1535.431564987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-default-rulefiles-1" (UniqueName: "kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-1") pod "prometheus-default-0" (UID: "4cea5088-2867-4484-914c-70644f9eed47") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.745518 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default"
Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.757124 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret"
Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.834659 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2"
Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.904352 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1"
Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.923910 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-77j8b"
Mar 20 00:32:00 crc kubenswrapper[4867]: I0320 00:32:00.934640 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566112-xv5p6"]
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.235165 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.235262 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.235336 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.235377 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.236298 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.236368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4cea5088-2867-4484-914c-70644f9eed47-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.240882 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-config\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.241263 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4cea5088-2867-4484-914c-70644f9eed47-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4cea5088-2867-4484-914c-70644f9eed47\") " pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.345165 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566112-xv5p6" event={"ID":"7a980d9d-47a1-4275-99e9-0677f0baff73","Type":"ContainerStarted","Data":"ed765246a82d1aa69c9c6c4948a1117731cc297cd08e8e76bed959d8e8ef02c0"}
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.346544 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0"
Mar 20 00:32:01 crc kubenswrapper[4867]: I0320 00:32:01.538859 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 20 00:32:01 crc kubenswrapper[4867]: W0320 00:32:01.547371 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cea5088_2867_4484_914c_70644f9eed47.slice/crio-cf279c9db7dee466fd9a265744865ecbe3ad845909cbc9369b97bcd248ae1c32 WatchSource:0}: Error finding container cf279c9db7dee466fd9a265744865ecbe3ad845909cbc9369b97bcd248ae1c32: Status 404 returned error can't find the container with id cf279c9db7dee466fd9a265744865ecbe3ad845909cbc9369b97bcd248ae1c32
Mar 20 00:32:02 crc kubenswrapper[4867]: I0320 00:32:02.352736 4867 generic.go:334] "Generic (PLEG): container finished" podID="7a980d9d-47a1-4275-99e9-0677f0baff73" containerID="e0f9c774f98d8a4bea376fa9477732776c21be04db0a52f63c4a121fdd47a6fa" exitCode=0
Mar 20 00:32:02 crc kubenswrapper[4867]: I0320 00:32:02.352802 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566112-xv5p6" event={"ID":"7a980d9d-47a1-4275-99e9-0677f0baff73","Type":"ContainerDied","Data":"e0f9c774f98d8a4bea376fa9477732776c21be04db0a52f63c4a121fdd47a6fa"}
Mar 20 00:32:02 crc kubenswrapper[4867]: I0320 00:32:02.354295 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4cea5088-2867-4484-914c-70644f9eed47","Type":"ContainerStarted","Data":"cf279c9db7dee466fd9a265744865ecbe3ad845909cbc9369b97bcd248ae1c32"}
Mar 20 00:32:03 crc kubenswrapper[4867]: I0320 00:32:03.641147 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566112-xv5p6"
Mar 20 00:32:03 crc kubenswrapper[4867]: I0320 00:32:03.770149 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdltq\" (UniqueName: \"kubernetes.io/projected/7a980d9d-47a1-4275-99e9-0677f0baff73-kube-api-access-fdltq\") pod \"7a980d9d-47a1-4275-99e9-0677f0baff73\" (UID: \"7a980d9d-47a1-4275-99e9-0677f0baff73\") "
Mar 20 00:32:03 crc kubenswrapper[4867]: I0320 00:32:03.776077 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a980d9d-47a1-4275-99e9-0677f0baff73-kube-api-access-fdltq" (OuterVolumeSpecName: "kube-api-access-fdltq") pod "7a980d9d-47a1-4275-99e9-0677f0baff73" (UID: "7a980d9d-47a1-4275-99e9-0677f0baff73"). InnerVolumeSpecName "kube-api-access-fdltq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:32:03 crc kubenswrapper[4867]: I0320 00:32:03.872455 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdltq\" (UniqueName: \"kubernetes.io/projected/7a980d9d-47a1-4275-99e9-0677f0baff73-kube-api-access-fdltq\") on node \"crc\" DevicePath \"\""
Mar 20 00:32:04 crc kubenswrapper[4867]: I0320 00:32:04.371285 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566112-xv5p6" event={"ID":"7a980d9d-47a1-4275-99e9-0677f0baff73","Type":"ContainerDied","Data":"ed765246a82d1aa69c9c6c4948a1117731cc297cd08e8e76bed959d8e8ef02c0"}
Mar 20 00:32:04 crc kubenswrapper[4867]: I0320 00:32:04.371320 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed765246a82d1aa69c9c6c4948a1117731cc297cd08e8e76bed959d8e8ef02c0"
Mar 20 00:32:04 crc kubenswrapper[4867]: I0320 00:32:04.371403 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566112-xv5p6"
Mar 20 00:32:04 crc kubenswrapper[4867]: I0320 00:32:04.699371 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566106-bxrm9"]
Mar 20 00:32:04 crc kubenswrapper[4867]: I0320 00:32:04.705083 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566106-bxrm9"]
Mar 20 00:32:05 crc kubenswrapper[4867]: I0320 00:32:05.381723 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4cea5088-2867-4484-914c-70644f9eed47","Type":"ContainerStarted","Data":"3bf0ef3b2109d6cea5bc485fd4b3f5e62fe736a0ba835df1f64f10c258fd72fb"}
Mar 20 00:32:06 crc kubenswrapper[4867]: I0320 00:32:06.435403 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2dc34e-ac70-4be1-9a54-475a773fa80e" path="/var/lib/kubelet/pods/3c2dc34e-ac70-4be1-9a54-475a773fa80e/volumes"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.811107 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"]
Mar 20 00:32:08 crc kubenswrapper[4867]: E0320 00:32:08.812743 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a980d9d-47a1-4275-99e9-0677f0baff73" containerName="oc"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.812868 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a980d9d-47a1-4275-99e9-0677f0baff73" containerName="oc"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.813059 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a980d9d-47a1-4275-99e9-0677f0baff73" containerName="oc"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.813696 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.820733 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"]
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.840099 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4wdp\" (UniqueName: \"kubernetes.io/projected/18484175-1344-4a70-8e0f-60b1adaecae8-kube-api-access-r4wdp\") pod \"default-snmp-webhook-6856cfb745-lgfg8\" (UID: \"18484175-1344-4a70-8e0f-60b1adaecae8\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.941660 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4wdp\" (UniqueName: \"kubernetes.io/projected/18484175-1344-4a70-8e0f-60b1adaecae8-kube-api-access-r4wdp\") pod \"default-snmp-webhook-6856cfb745-lgfg8\" (UID: \"18484175-1344-4a70-8e0f-60b1adaecae8\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"
Mar 20 00:32:08 crc kubenswrapper[4867]: I0320 00:32:08.969568 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4wdp\" (UniqueName: \"kubernetes.io/projected/18484175-1344-4a70-8e0f-60b1adaecae8-kube-api-access-r4wdp\") pod \"default-snmp-webhook-6856cfb745-lgfg8\" (UID: \"18484175-1344-4a70-8e0f-60b1adaecae8\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"
Mar 20 00:32:09 crc kubenswrapper[4867]: I0320 00:32:09.129771 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"
Mar 20 00:32:09 crc kubenswrapper[4867]: I0320 00:32:09.570382 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lgfg8"]
Mar 20 00:32:10 crc kubenswrapper[4867]: I0320 00:32:10.435053 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8" event={"ID":"18484175-1344-4a70-8e0f-60b1adaecae8","Type":"ContainerStarted","Data":"32869108f459b49d8d5300da58509a1768276ef8665fb636190e70650ad2a101"}
Mar 20 00:32:11 crc kubenswrapper[4867]: I0320 00:32:11.451147 4867 generic.go:334] "Generic (PLEG): container finished" podID="4cea5088-2867-4484-914c-70644f9eed47" containerID="3bf0ef3b2109d6cea5bc485fd4b3f5e62fe736a0ba835df1f64f10c258fd72fb" exitCode=0
Mar 20 00:32:11 crc kubenswrapper[4867]: I0320 00:32:11.451223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4cea5088-2867-4484-914c-70644f9eed47","Type":"ContainerDied","Data":"3bf0ef3b2109d6cea5bc485fd4b3f5e62fe736a0ba835df1f64f10c258fd72fb"}
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.059810 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.061666 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.065424 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.066257 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.066462 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.066521 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.067167 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-5z2pv"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.067226 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.067337 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246408 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246729 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-config-volume\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246759 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246783 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a53d90d8-2c87-4869-ad34-dc5db259425d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246801 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p76g\" (UniqueName: \"kubernetes.io/projected/a53d90d8-2c87-4869-ad34-dc5db259425d-kube-api-access-8p76g\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246823 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-web-config\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246838 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a53d90d8-2c87-4869-ad34-dc5db259425d-config-out\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.246854 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.247001 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348666 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348734 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348766 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-config-volume\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348819 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a53d90d8-2c87-4869-ad34-dc5db259425d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348836 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p76g\" (UniqueName: \"kubernetes.io/projected/a53d90d8-2c87-4869-ad34-dc5db259425d-kube-api-access-8p76g\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348858 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-web-config\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348883 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a53d90d8-2c87-4869-ad34-dc5db259425d-config-out\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.348908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.352999 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.353043 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/225f3ae83155e4795bb4f51cea02142dc87aa0ad6553171fa11fe9c0a0488d61/globalmount\"" pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.355036 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a53d90d8-2c87-4869-ad34-dc5db259425d-config-out\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.355342 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.355540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.355685 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.362036 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a53d90d8-2c87-4869-ad34-dc5db259425d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.362632 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-config-volume\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.363885 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a53d90d8-2c87-4869-ad34-dc5db259425d-web-config\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.367967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p76g\" (UniqueName: \"kubernetes.io/projected/a53d90d8-2c87-4869-ad34-dc5db259425d-kube-api-access-8p76g\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.379703 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e4bf26aa-cade-46b9-a126-3c1e02d5710d\") pod \"alertmanager-default-0\" (UID: \"a53d90d8-2c87-4869-ad34-dc5db259425d\") " pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:15 crc kubenswrapper[4867]: I0320 00:32:15.388306 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 20 00:32:17 crc kubenswrapper[4867]: I0320 00:32:17.209979 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 20 00:32:17 crc kubenswrapper[4867]: I0320 00:32:17.490271 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"a53d90d8-2c87-4869-ad34-dc5db259425d","Type":"ContainerStarted","Data":"f935efa6be5a8a6f1e672d90a61a339f214758ae318b15bd38c1a671e64b05d1"}
Mar 20 00:32:17 crc kubenswrapper[4867]: I0320 00:32:17.491863 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8" event={"ID":"18484175-1344-4a70-8e0f-60b1adaecae8","Type":"ContainerStarted","Data":"d18a0e4f49a85c6798e910daa68262caaa102f83f5cc05cf322310ed1049a32a"}
Mar 20 00:32:17 crc kubenswrapper[4867]: I0320 00:32:17.506640 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-lgfg8" podStartSLOduration=1.778687794 podStartE2EDuration="9.506602822s" podCreationTimestamp="2026-03-20 00:32:08 +0000 UTC" firstStartedPulling="2026-03-20 00:32:09.577758321 +0000 UTC m=+1543.804295878" lastFinishedPulling="2026-03-20 00:32:17.305673389 +0000 UTC m=+1551.532210906" observedRunningTime="2026-03-20 00:32:17.5045926 +0000 UTC m=+1551.731130127" watchObservedRunningTime="2026-03-20 00:32:17.506602822 +0000 UTC m=+1551.733140339"
Mar 20 00:32:18 crc kubenswrapper[4867]: I0320 00:32:18.860278 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 00:32:18 crc kubenswrapper[4867]: I0320 00:32:18.860677 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 00:32:18 crc kubenswrapper[4867]: I0320 00:32:18.860728 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm"
Mar 20 00:32:18 crc kubenswrapper[4867]: I0320 00:32:18.861342 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"533398062bfbfbe1fa6b73d9da3e04fbe63602429f4dc45fdc1ed10c2dc75279"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 00:32:18 crc kubenswrapper[4867]: I0320 00:32:18.861395 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://533398062bfbfbe1fa6b73d9da3e04fbe63602429f4dc45fdc1ed10c2dc75279" gracePeriod=600
Mar 20 00:32:19 crc kubenswrapper[4867]: I0320 00:32:19.514061 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"a53d90d8-2c87-4869-ad34-dc5db259425d","Type":"ContainerStarted","Data":"fc8d273d595049fa0b3b8fcb4218fcbcf21587951b248835b6a03ec36b22d50c"}
Mar 20 00:32:19 crc kubenswrapper[4867]: I0320 00:32:19.517108 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="533398062bfbfbe1fa6b73d9da3e04fbe63602429f4dc45fdc1ed10c2dc75279" exitCode=0
Mar 20 00:32:19 crc kubenswrapper[4867]: I0320 00:32:19.517168 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"533398062bfbfbe1fa6b73d9da3e04fbe63602429f4dc45fdc1ed10c2dc75279"}
Mar 20 00:32:19 crc kubenswrapper[4867]: I0320 00:32:19.517197 4867 scope.go:117] "RemoveContainer" containerID="2d4ae32207d952a9c4b195a19d9b88321c105b63860ad160eeabe54e733e476b"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.549084 4867 generic.go:334] "Generic (PLEG): container finished" podID="a53d90d8-2c87-4869-ad34-dc5db259425d" containerID="fc8d273d595049fa0b3b8fcb4218fcbcf21587951b248835b6a03ec36b22d50c" exitCode=0
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.549203 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"a53d90d8-2c87-4869-ad34-dc5db259425d","Type":"ContainerDied","Data":"fc8d273d595049fa0b3b8fcb4218fcbcf21587951b248835b6a03ec36b22d50c"}
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.552843 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerStarted","Data":"8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"}
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.555309 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4cea5088-2867-4484-914c-70644f9eed47","Type":"ContainerStarted","Data":"93f28ba20c29114cca9a7e483106491385663f203f91fed930c5e8164caab9d6"}
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.890085 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"]
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.892061 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.897115 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.897274 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.897587 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-9nrws"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.898658 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.923800 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"]
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.986504 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.986573 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/51581aac-eba3-4506-8565-9085cc1a2ab0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.986629 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz84h\" (UniqueName: \"kubernetes.io/projected/51581aac-eba3-4506-8565-9085cc1a2ab0-kube-api-access-hz84h\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.986664 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/51581aac-eba3-4506-8565-9085cc1a2ab0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:24 crc kubenswrapper[4867]: I0320 00:32:24.986720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.088482 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"
Mar 20 00:32:25
crc kubenswrapper[4867]: I0320 00:32:25.088578 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/51581aac-eba3-4506-8565-9085cc1a2ab0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.088605 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz84h\" (UniqueName: \"kubernetes.io/projected/51581aac-eba3-4506-8565-9085cc1a2ab0-kube-api-access-hz84h\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.088652 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/51581aac-eba3-4506-8565-9085cc1a2ab0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: E0320 00:32:25.088688 4867 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.088757 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: E0320 00:32:25.088779 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls podName:51581aac-eba3-4506-8565-9085cc1a2ab0 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:25.58875015 +0000 UTC m=+1559.815287667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" (UID: "51581aac-eba3-4506-8565-9085cc1a2ab0") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.089349 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/51581aac-eba3-4506-8565-9085cc1a2ab0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.089829 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/51581aac-eba3-4506-8565-9085cc1a2ab0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.097091 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-session-secret\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.107979 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz84h\" (UniqueName: \"kubernetes.io/projected/51581aac-eba3-4506-8565-9085cc1a2ab0-kube-api-access-hz84h\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: I0320 00:32:25.595469 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:25 crc kubenswrapper[4867]: E0320 00:32:25.595683 4867 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 20 00:32:25 crc kubenswrapper[4867]: E0320 00:32:25.596001 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls podName:51581aac-eba3-4506-8565-9085cc1a2ab0 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:26.595957959 +0000 UTC m=+1560.822495476 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" (UID: "51581aac-eba3-4506-8565-9085cc1a2ab0") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 20 00:32:26 crc kubenswrapper[4867]: I0320 00:32:26.589849 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4cea5088-2867-4484-914c-70644f9eed47","Type":"ContainerStarted","Data":"74f3f3942da3c28a5aea4c437e1421b3af300893913890308fc27f5e8fa05a5e"} Mar 20 00:32:26 crc kubenswrapper[4867]: I0320 00:32:26.607034 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:26 crc kubenswrapper[4867]: I0320 00:32:26.612920 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/51581aac-eba3-4506-8565-9085cc1a2ab0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m\" (UID: \"51581aac-eba3-4506-8565-9085cc1a2ab0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:26 crc kubenswrapper[4867]: I0320 00:32:26.713191 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-9nrws" Mar 20 00:32:26 crc kubenswrapper[4867]: I0320 00:32:26.722184 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.335390 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m"] Mar 20 00:32:27 crc kubenswrapper[4867]: W0320 00:32:27.345752 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51581aac_eba3_4506_8565_9085cc1a2ab0.slice/crio-22234dbe5ed4009cfe319cc1b8c14d0a8a5890d356c4c9c0b1fbd689a7aadbbf WatchSource:0}: Error finding container 22234dbe5ed4009cfe319cc1b8c14d0a8a5890d356c4c9c0b1fbd689a7aadbbf: Status 404 returned error can't find the container with id 22234dbe5ed4009cfe319cc1b8c14d0a8a5890d356c4c9c0b1fbd689a7aadbbf Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.597940 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerStarted","Data":"22234dbe5ed4009cfe319cc1b8c14d0a8a5890d356c4c9c0b1fbd689a7aadbbf"} Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.600305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"a53d90d8-2c87-4869-ad34-dc5db259425d","Type":"ContainerStarted","Data":"514ab32b2a88aa40532bd6a18a7bfc95b05c729c3b1746b808747d07de550896"} Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.857217 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w"] Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.858925 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.861775 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.863679 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.867628 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w"] Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.929998 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/61034c06-0be0-421a-86fe-88414af211f7-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.930066 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.930190 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/61034c06-0be0-421a-86fe-88414af211f7-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: 
\"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.930335 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:27 crc kubenswrapper[4867]: I0320 00:32:27.930414 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9lj\" (UniqueName: \"kubernetes.io/projected/61034c06-0be0-421a-86fe-88414af211f7-kube-api-access-gr9lj\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.031839 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/61034c06-0be0-421a-86fe-88414af211f7-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.032206 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.032228 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/61034c06-0be0-421a-86fe-88414af211f7-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.032266 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.032314 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9lj\" (UniqueName: \"kubernetes.io/projected/61034c06-0be0-421a-86fe-88414af211f7-kube-api-access-gr9lj\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: E0320 00:32:28.032656 4867 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 20 00:32:28 crc kubenswrapper[4867]: E0320 00:32:28.032736 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls podName:61034c06-0be0-421a-86fe-88414af211f7 nodeName:}" failed. 
No retries permitted until 2026-03-20 00:32:28.53271679 +0000 UTC m=+1562.759254307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" (UID: "61034c06-0be0-421a-86fe-88414af211f7") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.033256 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/61034c06-0be0-421a-86fe-88414af211f7-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.033516 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/61034c06-0be0-421a-86fe-88414af211f7-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.044169 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.049923 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9lj\" (UniqueName: 
\"kubernetes.io/projected/61034c06-0be0-421a-86fe-88414af211f7-kube-api-access-gr9lj\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: I0320 00:32:28.537064 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:28 crc kubenswrapper[4867]: E0320 00:32:28.537332 4867 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 20 00:32:28 crc kubenswrapper[4867]: E0320 00:32:28.537418 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls podName:61034c06-0be0-421a-86fe-88414af211f7 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:29.537399434 +0000 UTC m=+1563.763936951 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" (UID: "61034c06-0be0-421a-86fe-88414af211f7") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 20 00:32:29 crc kubenswrapper[4867]: I0320 00:32:29.549474 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:29 crc kubenswrapper[4867]: I0320 00:32:29.569348 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/61034c06-0be0-421a-86fe-88414af211f7-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w\" (UID: \"61034c06-0be0-421a-86fe-88414af211f7\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:29 crc kubenswrapper[4867]: I0320 00:32:29.617155 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"a53d90d8-2c87-4869-ad34-dc5db259425d","Type":"ContainerStarted","Data":"28056ed880819be4e0b8b270463eee15853051364bc044fac4026564c44065bc"} Mar 20 00:32:29 crc kubenswrapper[4867]: I0320 00:32:29.674848 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.763177 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6"] Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.764753 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.767078 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.767425 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.775458 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6"] Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.889456 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.889548 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h8t7\" (UniqueName: \"kubernetes.io/projected/7c011887-4b7c-450a-8f02-40c60be82359-kube-api-access-2h8t7\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: 
\"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.889613 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.889661 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c011887-4b7c-450a-8f02-40c60be82359-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.889683 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7c011887-4b7c-450a-8f02-40c60be82359-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.991992 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h8t7\" (UniqueName: \"kubernetes.io/projected/7c011887-4b7c-450a-8f02-40c60be82359-kube-api-access-2h8t7\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc 
kubenswrapper[4867]: I0320 00:32:31.992099 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.992149 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7c011887-4b7c-450a-8f02-40c60be82359-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.992167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c011887-4b7c-450a-8f02-40c60be82359-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.992216 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: E0320 00:32:31.992390 4867 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret 
"default-cloud1-sens-meter-proxy-tls" not found Mar 20 00:32:31 crc kubenswrapper[4867]: E0320 00:32:31.992469 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls podName:7c011887-4b7c-450a-8f02-40c60be82359 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:32.492442765 +0000 UTC m=+1566.718980282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" (UID: "7c011887-4b7c-450a-8f02-40c60be82359") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.994747 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/7c011887-4b7c-450a-8f02-40c60be82359-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:31 crc kubenswrapper[4867]: I0320 00:32:31.995201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c011887-4b7c-450a-8f02-40c60be82359-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:32 crc kubenswrapper[4867]: I0320 00:32:32.004647 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: 
\"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:32 crc kubenswrapper[4867]: I0320 00:32:32.011249 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h8t7\" (UniqueName: \"kubernetes.io/projected/7c011887-4b7c-450a-8f02-40c60be82359-kube-api-access-2h8t7\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:32 crc kubenswrapper[4867]: I0320 00:32:32.498542 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:32 crc kubenswrapper[4867]: E0320 00:32:32.498740 4867 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 20 00:32:32 crc kubenswrapper[4867]: E0320 00:32:32.498879 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls podName:7c011887-4b7c-450a-8f02-40c60be82359 nodeName:}" failed. No retries permitted until 2026-03-20 00:32:33.498859044 +0000 UTC m=+1567.725396561 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" (UID: "7c011887-4b7c-450a-8f02-40c60be82359") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 20 00:32:33 crc kubenswrapper[4867]: I0320 00:32:33.511758 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:33 crc kubenswrapper[4867]: I0320 00:32:33.526010 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c011887-4b7c-450a-8f02-40c60be82359-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6\" (UID: \"7c011887-4b7c-450a-8f02-40c60be82359\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:33 crc kubenswrapper[4867]: I0320 00:32:33.595797 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" Mar 20 00:32:33 crc kubenswrapper[4867]: I0320 00:32:33.819052 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w"] Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.004530 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6"] Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.671630 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4cea5088-2867-4484-914c-70644f9eed47","Type":"ContainerStarted","Data":"edb517be56765f87e8d7f74bc98a53f5b59cc7ff602a549be78e3b6e849461a7"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.679126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerStarted","Data":"76c4ed870d033f8345fad6ca6aa60cd32d3e3c7c04a5f3213ddf287e3054c6b7"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.679168 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerStarted","Data":"5fa542a408cc8da73fdc451a3e1efc096e57d4a06585fc8e22cfc5dae3c0916c"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.683020 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerStarted","Data":"c38dde625dc09076543a5258fd43f27444b0dad5181138776a353139e5c5b819"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.683057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerStarted","Data":"1fb810e5c29127437a018d98ba7bf5d845da00e0e067b1f2d3f9e14f2cf84fba"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.684684 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerStarted","Data":"c671eb5a19941904250d4684f0d4ea344b44fdfe09b0d152ae01f41c1a82bf90"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.684710 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerStarted","Data":"89a162386e81b76ec42d9fbc802a36f2ee332855995eb1ffcc1a4bb95dce38c1"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.696136 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.72779628 podStartE2EDuration="36.696120531s" podCreationTimestamp="2026-03-20 00:31:58 +0000 UTC" firstStartedPulling="2026-03-20 00:32:01.550062565 +0000 UTC m=+1535.776600082" lastFinishedPulling="2026-03-20 00:32:33.518386826 +0000 UTC m=+1567.744924333" observedRunningTime="2026-03-20 00:32:34.692026315 +0000 UTC m=+1568.918563832" watchObservedRunningTime="2026-03-20 00:32:34.696120531 +0000 UTC m=+1568.922658048" Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.703604 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"a53d90d8-2c87-4869-ad34-dc5db259425d","Type":"ContainerStarted","Data":"a26c8fa15ace056507326667a3480b5e45667ea3db9a7c53f82b714801edd8a9"} Mar 20 00:32:34 crc kubenswrapper[4867]: I0320 00:32:34.726771 4867 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=11.73750476 podStartE2EDuration="20.726756626s" podCreationTimestamp="2026-03-20 00:32:14 +0000 UTC" firstStartedPulling="2026-03-20 00:32:24.552408574 +0000 UTC m=+1558.778946101" lastFinishedPulling="2026-03-20 00:32:33.54166045 +0000 UTC m=+1567.768197967" observedRunningTime="2026-03-20 00:32:34.724730054 +0000 UTC m=+1568.951267571" watchObservedRunningTime="2026-03-20 00:32:34.726756626 +0000 UTC m=+1568.953294143" Mar 20 00:32:35 crc kubenswrapper[4867]: I0320 00:32:35.711908 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerStarted","Data":"66c4ad5fe1d9aaad7fd3291d12c35929c15d3bc0ced9ae2f6a4d0a6b6e9071e8"} Mar 20 00:32:35 crc kubenswrapper[4867]: I0320 00:32:35.715062 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerStarted","Data":"e3fe40bc1b0f424f9cd2ffbb83790d51dcac9b1e96a2c29848edaa4d5985da12"} Mar 20 00:32:36 crc kubenswrapper[4867]: I0320 00:32:36.346688 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.224475 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf"] Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.225900 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.228340 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.228643 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.240372 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf"] Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.390697 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/13a8a336-a26c-43c2-9acc-6d03492f9f0e-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.390750 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/13a8a336-a26c-43c2-9acc-6d03492f9f0e-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.390790 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jslrn\" (UniqueName: \"kubernetes.io/projected/13a8a336-a26c-43c2-9acc-6d03492f9f0e-kube-api-access-jslrn\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: 
\"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.390838 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/13a8a336-a26c-43c2-9acc-6d03492f9f0e-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.492186 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jslrn\" (UniqueName: \"kubernetes.io/projected/13a8a336-a26c-43c2-9acc-6d03492f9f0e-kube-api-access-jslrn\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.492481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/13a8a336-a26c-43c2-9acc-6d03492f9f0e-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.492558 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/13a8a336-a26c-43c2-9acc-6d03492f9f0e-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 
00:32:39.492599 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/13a8a336-a26c-43c2-9acc-6d03492f9f0e-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.493232 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/13a8a336-a26c-43c2-9acc-6d03492f9f0e-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.493904 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/13a8a336-a26c-43c2-9acc-6d03492f9f0e-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.512011 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/13a8a336-a26c-43c2-9acc-6d03492f9f0e-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.518172 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jslrn\" (UniqueName: \"kubernetes.io/projected/13a8a336-a26c-43c2-9acc-6d03492f9f0e-kube-api-access-jslrn\") pod 
\"default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf\" (UID: \"13a8a336-a26c-43c2-9acc-6d03492f9f0e\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.549032 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" Mar 20 00:32:39 crc kubenswrapper[4867]: I0320 00:32:39.954084 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf"] Mar 20 00:32:39 crc kubenswrapper[4867]: W0320 00:32:39.969620 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13a8a336_a26c_43c2_9acc_6d03492f9f0e.slice/crio-b805ed6d49d4606d50555ff7e6a627b24f0d7a4064273538f8dadbf840c4d066 WatchSource:0}: Error finding container b805ed6d49d4606d50555ff7e6a627b24f0d7a4064273538f8dadbf840c4d066: Status 404 returned error can't find the container with id b805ed6d49d4606d50555ff7e6a627b24f0d7a4064273538f8dadbf840c4d066 Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.122973 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls"] Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.124783 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.131746 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.140460 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls"] Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.301950 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/52172008-2d38-4ef9-81f0-ddafa1f96be7-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.302020 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/52172008-2d38-4ef9-81f0-ddafa1f96be7-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.302121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/52172008-2d38-4ef9-81f0-ddafa1f96be7-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.302164 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drnz4\" (UniqueName: \"kubernetes.io/projected/52172008-2d38-4ef9-81f0-ddafa1f96be7-kube-api-access-drnz4\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.403660 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drnz4\" (UniqueName: \"kubernetes.io/projected/52172008-2d38-4ef9-81f0-ddafa1f96be7-kube-api-access-drnz4\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.403821 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/52172008-2d38-4ef9-81f0-ddafa1f96be7-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.403858 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/52172008-2d38-4ef9-81f0-ddafa1f96be7-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.403921 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: 
\"kubernetes.io/secret/52172008-2d38-4ef9-81f0-ddafa1f96be7-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.404943 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/52172008-2d38-4ef9-81f0-ddafa1f96be7-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.405427 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/52172008-2d38-4ef9-81f0-ddafa1f96be7-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.409243 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/52172008-2d38-4ef9-81f0-ddafa1f96be7-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.420358 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnz4\" (UniqueName: \"kubernetes.io/projected/52172008-2d38-4ef9-81f0-ddafa1f96be7-kube-api-access-drnz4\") pod \"default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls\" (UID: \"52172008-2d38-4ef9-81f0-ddafa1f96be7\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.442014 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.752623 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerStarted","Data":"594efb5aa954921f0f7166d2cd40e492d4c1435c8005a7c454378300bc303376"} Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.755356 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerStarted","Data":"5f784c3976e74b7489db5093759cc67cc0e1144590c7eacf0f4dee5e8a259de2"} Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.757960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerStarted","Data":"796138ceac42acbf13b380000704f3afe0606f139162db66dc2e84e830ba22c6"} Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.759773 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerStarted","Data":"2aad850cf31a2bf5c9219b815573bb4c2041d9e1fdaf77989e9aac6149ca641e"} Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.759804 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" 
event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerStarted","Data":"d0c07f12b59280101cd68c8231f2a75f1598b81941276974e30ab19823f4558f"} Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.759815 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerStarted","Data":"b805ed6d49d4606d50555ff7e6a627b24f0d7a4064273538f8dadbf840c4d066"} Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.775845 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" podStartSLOduration=4.185405642 podStartE2EDuration="16.775826989s" podCreationTimestamp="2026-03-20 00:32:24 +0000 UTC" firstStartedPulling="2026-03-20 00:32:27.354501214 +0000 UTC m=+1561.581038731" lastFinishedPulling="2026-03-20 00:32:39.944922561 +0000 UTC m=+1574.171460078" observedRunningTime="2026-03-20 00:32:40.769745531 +0000 UTC m=+1574.996283058" watchObservedRunningTime="2026-03-20 00:32:40.775826989 +0000 UTC m=+1575.002364526" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.802869 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" podStartSLOduration=3.9287215680000003 podStartE2EDuration="9.80285373s" podCreationTimestamp="2026-03-20 00:32:31 +0000 UTC" firstStartedPulling="2026-03-20 00:32:34.024187129 +0000 UTC m=+1568.250724646" lastFinishedPulling="2026-03-20 00:32:39.898319291 +0000 UTC m=+1574.124856808" observedRunningTime="2026-03-20 00:32:40.798101776 +0000 UTC m=+1575.024639313" watchObservedRunningTime="2026-03-20 00:32:40.80285373 +0000 UTC m=+1575.029391237" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.829054 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" podStartSLOduration=7.454558065 podStartE2EDuration="13.829038609s" podCreationTimestamp="2026-03-20 00:32:27 +0000 UTC" firstStartedPulling="2026-03-20 00:32:33.827927627 +0000 UTC m=+1568.054465134" lastFinishedPulling="2026-03-20 00:32:40.202408161 +0000 UTC m=+1574.428945678" observedRunningTime="2026-03-20 00:32:40.823735802 +0000 UTC m=+1575.050273339" watchObservedRunningTime="2026-03-20 00:32:40.829038609 +0000 UTC m=+1575.055576126" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.847771 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" podStartSLOduration=1.3186486579999999 podStartE2EDuration="1.847750765s" podCreationTimestamp="2026-03-20 00:32:39 +0000 UTC" firstStartedPulling="2026-03-20 00:32:39.973250196 +0000 UTC m=+1574.199787713" lastFinishedPulling="2026-03-20 00:32:40.502352303 +0000 UTC m=+1574.728889820" observedRunningTime="2026-03-20 00:32:40.846313147 +0000 UTC m=+1575.072850664" watchObservedRunningTime="2026-03-20 00:32:40.847750765 +0000 UTC m=+1575.074288282" Mar 20 00:32:40 crc kubenswrapper[4867]: I0320 00:32:40.883715 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls"] Mar 20 00:32:41 crc kubenswrapper[4867]: I0320 00:32:41.768815 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerStarted","Data":"ea15a388e3631bc6506f65a1a61399bb1eacddc1fd96b624cf33e10013f20acf"} Mar 20 00:32:41 crc kubenswrapper[4867]: I0320 00:32:41.769370 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" 
event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerStarted","Data":"c85b848aaa2624a7191c03c6255af6edb23c5b14586d4b06ce3708d65a064fdb"} Mar 20 00:32:41 crc kubenswrapper[4867]: I0320 00:32:41.769386 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerStarted","Data":"d0b535d22b0930c18d389dea20d1ce6aa02e543158d8d1ca1b90db0b3b673301"} Mar 20 00:32:41 crc kubenswrapper[4867]: I0320 00:32:41.784509 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" podStartSLOduration=1.386543983 podStartE2EDuration="1.784472597s" podCreationTimestamp="2026-03-20 00:32:40 +0000 UTC" firstStartedPulling="2026-03-20 00:32:40.890088073 +0000 UTC m=+1575.116625590" lastFinishedPulling="2026-03-20 00:32:41.288016687 +0000 UTC m=+1575.514554204" observedRunningTime="2026-03-20 00:32:41.781450129 +0000 UTC m=+1576.007987646" watchObservedRunningTime="2026-03-20 00:32:41.784472597 +0000 UTC m=+1576.011010134" Mar 20 00:32:46 crc kubenswrapper[4867]: I0320 00:32:46.347809 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 20 00:32:46 crc kubenswrapper[4867]: I0320 00:32:46.385690 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 20 00:32:46 crc kubenswrapper[4867]: I0320 00:32:46.855660 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.163920 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-txgjt"] Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.164469 4867 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" podUID="780cb59f-bd17-4227-92b0-8f9a8ffbddfc" containerName="default-interconnect" containerID="cri-o://5d60e91a67c17c63cf8f0f3682930a0bb1ba6d31565db97d6c0ba86b29c36346" gracePeriod=30 Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.840624 4867 generic.go:334] "Generic (PLEG): container finished" podID="780cb59f-bd17-4227-92b0-8f9a8ffbddfc" containerID="5d60e91a67c17c63cf8f0f3682930a0bb1ba6d31565db97d6c0ba86b29c36346" exitCode=0 Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.840719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" event={"ID":"780cb59f-bd17-4227-92b0-8f9a8ffbddfc","Type":"ContainerDied","Data":"5d60e91a67c17c63cf8f0f3682930a0bb1ba6d31565db97d6c0ba86b29c36346"} Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.843161 4867 generic.go:334] "Generic (PLEG): container finished" podID="13a8a336-a26c-43c2-9acc-6d03492f9f0e" containerID="d0c07f12b59280101cd68c8231f2a75f1598b81941276974e30ab19823f4558f" exitCode=0 Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.843207 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerDied","Data":"d0c07f12b59280101cd68c8231f2a75f1598b81941276974e30ab19823f4558f"} Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.843682 4867 scope.go:117] "RemoveContainer" containerID="d0c07f12b59280101cd68c8231f2a75f1598b81941276974e30ab19823f4558f" Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.848241 4867 generic.go:334] "Generic (PLEG): container finished" podID="52172008-2d38-4ef9-81f0-ddafa1f96be7" containerID="c85b848aaa2624a7191c03c6255af6edb23c5b14586d4b06ce3708d65a064fdb" exitCode=0 Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.848281 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerDied","Data":"c85b848aaa2624a7191c03c6255af6edb23c5b14586d4b06ce3708d65a064fdb"} Mar 20 00:32:51 crc kubenswrapper[4867]: I0320 00:32:51.849098 4867 scope.go:117] "RemoveContainer" containerID="c85b848aaa2624a7191c03c6255af6edb23c5b14586d4b06ce3708d65a064fdb" Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.883139 4867 generic.go:334] "Generic (PLEG): container finished" podID="51581aac-eba3-4506-8565-9085cc1a2ab0" containerID="76c4ed870d033f8345fad6ca6aa60cd32d3e3c7c04a5f3213ddf287e3054c6b7" exitCode=0 Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.883208 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerDied","Data":"76c4ed870d033f8345fad6ca6aa60cd32d3e3c7c04a5f3213ddf287e3054c6b7"} Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.883818 4867 scope.go:117] "RemoveContainer" containerID="76c4ed870d033f8345fad6ca6aa60cd32d3e3c7c04a5f3213ddf287e3054c6b7" Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.900528 4867 generic.go:334] "Generic (PLEG): container finished" podID="61034c06-0be0-421a-86fe-88414af211f7" containerID="66c4ad5fe1d9aaad7fd3291d12c35929c15d3bc0ced9ae2f6a4d0a6b6e9071e8" exitCode=0 Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.900610 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerDied","Data":"66c4ad5fe1d9aaad7fd3291d12c35929c15d3bc0ced9ae2f6a4d0a6b6e9071e8"} Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.901131 4867 scope.go:117] "RemoveContainer" containerID="66c4ad5fe1d9aaad7fd3291d12c35929c15d3bc0ced9ae2f6a4d0a6b6e9071e8" Mar 
20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.905390 4867 generic.go:334] "Generic (PLEG): container finished" podID="7c011887-4b7c-450a-8f02-40c60be82359" containerID="e3fe40bc1b0f424f9cd2ffbb83790d51dcac9b1e96a2c29848edaa4d5985da12" exitCode=0 Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.905421 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerDied","Data":"e3fe40bc1b0f424f9cd2ffbb83790d51dcac9b1e96a2c29848edaa4d5985da12"} Mar 20 00:32:52 crc kubenswrapper[4867]: I0320 00:32:52.908976 4867 scope.go:117] "RemoveContainer" containerID="e3fe40bc1b0f424f9cd2ffbb83790d51dcac9b1e96a2c29848edaa4d5985da12" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.027076 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.081994 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-l6fnr"] Mar 20 00:32:53 crc kubenswrapper[4867]: E0320 00:32:53.082313 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="780cb59f-bd17-4227-92b0-8f9a8ffbddfc" containerName="default-interconnect" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.082328 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="780cb59f-bd17-4227-92b0-8f9a8ffbddfc" containerName="default-interconnect" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.082452 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="780cb59f-bd17-4227-92b0-8f9a8ffbddfc" containerName="default-interconnect" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.082890 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.088977 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-l6fnr"] Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200051 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-ca\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200143 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-credentials\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200172 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-users\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200197 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-credentials\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200226 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-ca\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200255 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-config\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200289 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g76xv\" (UniqueName: \"kubernetes.io/projected/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-kube-api-access-g76xv\") pod \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\" (UID: \"780cb59f-bd17-4227-92b0-8f9a8ffbddfc\") " Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200542 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200602 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200625 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-sasl-users\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200796 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/063672f7-25c6-47fb-a6fc-8149c97062d5-sasl-config\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.200833 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675hp\" (UniqueName: 
\"kubernetes.io/projected/063672f7-25c6-47fb-a6fc-8149c97062d5-kube-api-access-675hp\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.207062 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.208471 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.208503 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "default-interconnect-inter-router-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.209723 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.209864 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.213699 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.213720 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-kube-api-access-g76xv" (OuterVolumeSpecName: "kube-api-access-g76xv") pod "780cb59f-bd17-4227-92b0-8f9a8ffbddfc" (UID: "780cb59f-bd17-4227-92b0-8f9a8ffbddfc"). InnerVolumeSpecName "kube-api-access-g76xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302439 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-sasl-users\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/063672f7-25c6-47fb-a6fc-8149c97062d5-sasl-config\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302539 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675hp\" (UniqueName: \"kubernetes.io/projected/063672f7-25c6-47fb-a6fc-8149c97062d5-kube-api-access-675hp\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302588 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302620 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: 
\"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302637 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302671 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302720 4867 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302731 4867 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302741 4867 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: 
\"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302750 4867 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302759 4867 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302768 4867 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.302777 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g76xv\" (UniqueName: \"kubernetes.io/projected/780cb59f-bd17-4227-92b0-8f9a8ffbddfc-kube-api-access-g76xv\") on node \"crc\" DevicePath \"\"" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.307815 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/063672f7-25c6-47fb-a6fc-8149c97062d5-sasl-config\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.308064 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-openstack-credentials\") pod 
\"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.314332 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.314357 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-sasl-users\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.316168 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.316949 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/063672f7-25c6-47fb-a6fc-8149c97062d5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.327606 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675hp\" (UniqueName: \"kubernetes.io/projected/063672f7-25c6-47fb-a6fc-8149c97062d5-kube-api-access-675hp\") pod \"default-interconnect-68864d46cb-l6fnr\" (UID: \"063672f7-25c6-47fb-a6fc-8149c97062d5\") " pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.401603 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.833245 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-l6fnr"] Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.914246 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" event={"ID":"780cb59f-bd17-4227-92b0-8f9a8ffbddfc","Type":"ContainerDied","Data":"ce27f3a9633d78b2d52c61f93b843f300065265b6997529058096cd4e92d0b16"} Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.914297 4867 scope.go:117] "RemoveContainer" containerID="5d60e91a67c17c63cf8f0f3682930a0bb1ba6d31565db97d6c0ba86b29c36346" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.914422 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-txgjt" Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.921938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerStarted","Data":"d7bc3d4e567f0aa1ecba4d78e849a651af92f74c30019c79e30ed055f362ce3f"} Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.927577 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerStarted","Data":"cb4e6222c5acb79ad9957d48076ea41abc8ac066cd778968263132485f524183"} Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.930338 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerStarted","Data":"0066291be27dd7ff57928c279db066498df708c8e69cb593c6afb28416b71c95"} Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.931437 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" event={"ID":"063672f7-25c6-47fb-a6fc-8149c97062d5","Type":"ContainerStarted","Data":"0ae7f16d388797154bd408a4c0d7915442cfa61f000bcc4118f23cf22d461a3b"} Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.945375 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerStarted","Data":"a1249afea10ed65e194ae7b4499e1334bbfd3519ecc156b3e42047244278bc55"} Mar 20 00:32:53 crc kubenswrapper[4867]: I0320 00:32:53.967372 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerStarted","Data":"76c059a3f0900bd9728ed8a77d62b7576913f48efd27fef69be139128262036d"} Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.036358 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-txgjt"] Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.046654 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-txgjt"] Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.429808 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="780cb59f-bd17-4227-92b0-8f9a8ffbddfc" path="/var/lib/kubelet/pods/780cb59f-bd17-4227-92b0-8f9a8ffbddfc/volumes" Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.899462 4867 scope.go:117] "RemoveContainer" containerID="b5fe49a7c6e1b1e2f6a82c5215a467fbd1eece2c10f9f313630fdeaa46ee28a0" Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.977619 4867 generic.go:334] "Generic (PLEG): container finished" podID="7c011887-4b7c-450a-8f02-40c60be82359" containerID="d7bc3d4e567f0aa1ecba4d78e849a651af92f74c30019c79e30ed055f362ce3f" exitCode=0 Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.977674 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerDied","Data":"d7bc3d4e567f0aa1ecba4d78e849a651af92f74c30019c79e30ed055f362ce3f"} Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.977725 4867 scope.go:117] "RemoveContainer" containerID="e3fe40bc1b0f424f9cd2ffbb83790d51dcac9b1e96a2c29848edaa4d5985da12" Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.978275 4867 scope.go:117] "RemoveContainer" containerID="d7bc3d4e567f0aa1ecba4d78e849a651af92f74c30019c79e30ed055f362ce3f" Mar 20 00:32:54 
crc kubenswrapper[4867]: E0320 00:32:54.978591 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6_service-telemetry(7c011887-4b7c-450a-8f02-40c60be82359)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" podUID="7c011887-4b7c-450a-8f02-40c60be82359" Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.982037 4867 generic.go:334] "Generic (PLEG): container finished" podID="51581aac-eba3-4506-8565-9085cc1a2ab0" containerID="0066291be27dd7ff57928c279db066498df708c8e69cb593c6afb28416b71c95" exitCode=0 Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.982109 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerDied","Data":"0066291be27dd7ff57928c279db066498df708c8e69cb593c6afb28416b71c95"} Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.982425 4867 scope.go:117] "RemoveContainer" containerID="0066291be27dd7ff57928c279db066498df708c8e69cb593c6afb28416b71c95" Mar 20 00:32:54 crc kubenswrapper[4867]: E0320 00:32:54.982620 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m_service-telemetry(51581aac-eba3-4506-8565-9085cc1a2ab0)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" podUID="51581aac-eba3-4506-8565-9085cc1a2ab0" Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.983812 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" 
event={"ID":"063672f7-25c6-47fb-a6fc-8149c97062d5","Type":"ContainerStarted","Data":"256f525f3188cd4b039264f91e3f0a4d8f109b49d52e5793518fe8ee6963c72a"} Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.986188 4867 generic.go:334] "Generic (PLEG): container finished" podID="61034c06-0be0-421a-86fe-88414af211f7" containerID="a1249afea10ed65e194ae7b4499e1334bbfd3519ecc156b3e42047244278bc55" exitCode=0 Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.986231 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerDied","Data":"a1249afea10ed65e194ae7b4499e1334bbfd3519ecc156b3e42047244278bc55"} Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.986474 4867 scope.go:117] "RemoveContainer" containerID="a1249afea10ed65e194ae7b4499e1334bbfd3519ecc156b3e42047244278bc55" Mar 20 00:32:54 crc kubenswrapper[4867]: E0320 00:32:54.986641 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w_service-telemetry(61034c06-0be0-421a-86fe-88414af211f7)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" podUID="61034c06-0be0-421a-86fe-88414af211f7" Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.988403 4867 generic.go:334] "Generic (PLEG): container finished" podID="52172008-2d38-4ef9-81f0-ddafa1f96be7" containerID="76c059a3f0900bd9728ed8a77d62b7576913f48efd27fef69be139128262036d" exitCode=0 Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.988432 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" 
event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerDied","Data":"76c059a3f0900bd9728ed8a77d62b7576913f48efd27fef69be139128262036d"} Mar 20 00:32:54 crc kubenswrapper[4867]: I0320 00:32:54.988685 4867 scope.go:117] "RemoveContainer" containerID="76c059a3f0900bd9728ed8a77d62b7576913f48efd27fef69be139128262036d" Mar 20 00:32:54 crc kubenswrapper[4867]: E0320 00:32:54.988841 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls_service-telemetry(52172008-2d38-4ef9-81f0-ddafa1f96be7)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" podUID="52172008-2d38-4ef9-81f0-ddafa1f96be7" Mar 20 00:32:55 crc kubenswrapper[4867]: I0320 00:32:55.046254 4867 scope.go:117] "RemoveContainer" containerID="76c4ed870d033f8345fad6ca6aa60cd32d3e3c7c04a5f3213ddf287e3054c6b7" Mar 20 00:32:55 crc kubenswrapper[4867]: I0320 00:32:55.052001 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-l6fnr" podStartSLOduration=4.051987931 podStartE2EDuration="4.051987931s" podCreationTimestamp="2026-03-20 00:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 00:32:55.049931338 +0000 UTC m=+1589.276468855" watchObservedRunningTime="2026-03-20 00:32:55.051987931 +0000 UTC m=+1589.278525448" Mar 20 00:32:55 crc kubenswrapper[4867]: I0320 00:32:55.127869 4867 scope.go:117] "RemoveContainer" containerID="66c4ad5fe1d9aaad7fd3291d12c35929c15d3bc0ced9ae2f6a4d0a6b6e9071e8" Mar 20 00:32:55 crc kubenswrapper[4867]: I0320 00:32:55.213664 4867 scope.go:117] "RemoveContainer" containerID="c85b848aaa2624a7191c03c6255af6edb23c5b14586d4b06ce3708d65a064fdb" Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 
00:32:56.006272 4867 generic.go:334] "Generic (PLEG): container finished" podID="13a8a336-a26c-43c2-9acc-6d03492f9f0e" containerID="cb4e6222c5acb79ad9957d48076ea41abc8ac066cd778968263132485f524183" exitCode=0
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.006301 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerDied","Data":"cb4e6222c5acb79ad9957d48076ea41abc8ac066cd778968263132485f524183"}
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.006367 4867 scope.go:117] "RemoveContainer" containerID="d0c07f12b59280101cd68c8231f2a75f1598b81941276974e30ab19823f4558f"
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.006910 4867 scope.go:117] "RemoveContainer" containerID="cb4e6222c5acb79ad9957d48076ea41abc8ac066cd778968263132485f524183"
Mar 20 00:32:56 crc kubenswrapper[4867]: E0320 00:32:56.007208 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf_service-telemetry(13a8a336-a26c-43c2-9acc-6d03492f9f0e)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" podUID="13a8a336-a26c-43c2-9acc-6d03492f9f0e"
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.903263 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"]
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.904419 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.906456 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config"
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.914677 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 20 00:32:56 crc kubenswrapper[4867]: I0320 00:32:56.916256 4867 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.098560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/0fb658d4-90b0-4a29-9892-5baac55ad5e5-qdr-test-config\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.098623 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/0fb658d4-90b0-4a29-9892-5baac55ad5e5-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.098671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsdhp\" (UniqueName: \"kubernetes.io/projected/0fb658d4-90b0-4a29-9892-5baac55ad5e5-kube-api-access-nsdhp\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.200076 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/0fb658d4-90b0-4a29-9892-5baac55ad5e5-qdr-test-config\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.200332 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/0fb658d4-90b0-4a29-9892-5baac55ad5e5-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.200446 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsdhp\" (UniqueName: \"kubernetes.io/projected/0fb658d4-90b0-4a29-9892-5baac55ad5e5-kube-api-access-nsdhp\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.201080 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/0fb658d4-90b0-4a29-9892-5baac55ad5e5-qdr-test-config\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.206034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/0fb658d4-90b0-4a29-9892-5baac55ad5e5-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.222663 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsdhp\" (UniqueName: \"kubernetes.io/projected/0fb658d4-90b0-4a29-9892-5baac55ad5e5-kube-api-access-nsdhp\") pod \"qdr-test\" (UID: \"0fb658d4-90b0-4a29-9892-5baac55ad5e5\") " pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.521944 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 20 00:32:57 crc kubenswrapper[4867]: I0320 00:32:57.951263 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 20 00:32:58 crc kubenswrapper[4867]: I0320 00:32:58.025373 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"0fb658d4-90b0-4a29-9892-5baac55ad5e5","Type":"ContainerStarted","Data":"7032df64d49a78fde4195fcfc7b3495b1cc9edbaf93f6d374d0a04d6bd297a24"}
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.075928 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"0fb658d4-90b0-4a29-9892-5baac55ad5e5","Type":"ContainerStarted","Data":"c8cefdc41ab2d056cb0a788e07c4610d497269795ea3dc5cca97d236a290cc88"}
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.091895 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.228521696 podStartE2EDuration="10.091877379s" podCreationTimestamp="2026-03-20 00:32:56 +0000 UTC" firstStartedPulling="2026-03-20 00:32:57.964526416 +0000 UTC m=+1592.191063933" lastFinishedPulling="2026-03-20 00:33:05.827882099 +0000 UTC m=+1600.054419616" observedRunningTime="2026-03-20 00:33:06.086753516 +0000 UTC m=+1600.313291043" watchObservedRunningTime="2026-03-20 00:33:06.091877379 +0000 UTC m=+1600.318414886"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.352704 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-ddh68"]
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.354306 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.356388 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.356722 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.357128 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.357418 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.357718 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.358149 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.366417 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-ddh68"]
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.426867 4867 scope.go:117] "RemoveContainer" containerID="76c059a3f0900bd9728ed8a77d62b7576913f48efd27fef69be139128262036d"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.428443 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.551687 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4cfs\" (UniqueName: \"kubernetes.io/projected/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-kube-api-access-x4cfs\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.551750 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.552261 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-publisher\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.552294 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-healthcheck-log\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.552352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-config\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.552539 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.552591 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-sensubility-config\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654232 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654286 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-sensubility-config\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654351 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4cfs\" (UniqueName: \"kubernetes.io/projected/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-kube-api-access-x4cfs\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654380 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654411 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-publisher\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654436 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-healthcheck-log\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.654482 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-config\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.655482 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.655817 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-sensubility-config\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.655869 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-publisher\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.656039 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-healthcheck-log\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.656197 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.656454 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-config\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.696924 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4cfs\" (UniqueName: \"kubernetes.io/projected/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-kube-api-access-x4cfs\") pod \"stf-smoketest-smoke1-ddh68\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") " pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.737165 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"]
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.738601 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.758797 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.759074 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xkhm\" (UniqueName: \"kubernetes.io/projected/89c7f083-d89e-4576-9a27-adf5b9fdbaf5-kube-api-access-5xkhm\") pod \"curl\" (UID: \"89c7f083-d89e-4576-9a27-adf5b9fdbaf5\") " pod="service-telemetry/curl"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.859934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xkhm\" (UniqueName: \"kubernetes.io/projected/89c7f083-d89e-4576-9a27-adf5b9fdbaf5-kube-api-access-5xkhm\") pod \"curl\" (UID: \"89c7f083-d89e-4576-9a27-adf5b9fdbaf5\") " pod="service-telemetry/curl"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.875850 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xkhm\" (UniqueName: \"kubernetes.io/projected/89c7f083-d89e-4576-9a27-adf5b9fdbaf5-kube-api-access-5xkhm\") pod \"curl\" (UID: \"89c7f083-d89e-4576-9a27-adf5b9fdbaf5\") " pod="service-telemetry/curl"
Mar 20 00:33:06 crc kubenswrapper[4867]: I0320 00:33:06.976927 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:07 crc kubenswrapper[4867]: I0320 00:33:07.059857 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 20 00:33:07 crc kubenswrapper[4867]: I0320 00:33:07.104144 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls" event={"ID":"52172008-2d38-4ef9-81f0-ddafa1f96be7","Type":"ContainerStarted","Data":"7fd215ebfa79250b4e1ebde3929d88611020fc7cf1760eadee28da9ab79bcdc9"}
Mar 20 00:33:07 crc kubenswrapper[4867]: I0320 00:33:07.421247 4867 scope.go:117] "RemoveContainer" containerID="d7bc3d4e567f0aa1ecba4d78e849a651af92f74c30019c79e30ed055f362ce3f"
Mar 20 00:33:07 crc kubenswrapper[4867]: I0320 00:33:07.422401 4867 scope.go:117] "RemoveContainer" containerID="cb4e6222c5acb79ad9957d48076ea41abc8ac066cd778968263132485f524183"
Mar 20 00:33:07 crc kubenswrapper[4867]: I0320 00:33:07.435317 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-ddh68"]
Mar 20 00:33:07 crc kubenswrapper[4867]: W0320 00:33:07.460911 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49cf5b16_ce7e_43ce_8e45_2a665f5e0406.slice/crio-eab86ee6185039bff8f1a46fd9eae140f754c19b0a8950faa811c4e19cfbb6f4 WatchSource:0}: Error finding container eab86ee6185039bff8f1a46fd9eae140f754c19b0a8950faa811c4e19cfbb6f4: Status 404 returned error can't find the container with id eab86ee6185039bff8f1a46fd9eae140f754c19b0a8950faa811c4e19cfbb6f4
Mar 20 00:33:07 crc kubenswrapper[4867]: I0320 00:33:07.540300 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"]
Mar 20 00:33:08 crc kubenswrapper[4867]: I0320 00:33:08.111502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf" event={"ID":"13a8a336-a26c-43c2-9acc-6d03492f9f0e","Type":"ContainerStarted","Data":"086b9ec85aa05943ba59b47005cec640f35336e7ea38970301f1877ce6da6af3"}
Mar 20 00:33:08 crc kubenswrapper[4867]: I0320 00:33:08.115340 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6" event={"ID":"7c011887-4b7c-450a-8f02-40c60be82359","Type":"ContainerStarted","Data":"93a71a88935cd2377740c12885e652651c7b48b6147d3eb767b545f97524a276"}
Mar 20 00:33:08 crc kubenswrapper[4867]: I0320 00:33:08.118392 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"89c7f083-d89e-4576-9a27-adf5b9fdbaf5","Type":"ContainerStarted","Data":"09512ae76c0979ca382d0e627823bd5b5e7aee790bc77e20944e8b9012fe574f"}
Mar 20 00:33:08 crc kubenswrapper[4867]: I0320 00:33:08.119574 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-ddh68" event={"ID":"49cf5b16-ce7e-43ce-8e45-2a665f5e0406","Type":"ContainerStarted","Data":"eab86ee6185039bff8f1a46fd9eae140f754c19b0a8950faa811c4e19cfbb6f4"}
Mar 20 00:33:09 crc kubenswrapper[4867]: I0320 00:33:09.422306 4867 scope.go:117] "RemoveContainer" containerID="a1249afea10ed65e194ae7b4499e1334bbfd3519ecc156b3e42047244278bc55"
Mar 20 00:33:10 crc kubenswrapper[4867]: I0320 00:33:10.421708 4867 scope.go:117] "RemoveContainer" containerID="0066291be27dd7ff57928c279db066498df708c8e69cb593c6afb28416b71c95"
Mar 20 00:33:11 crc kubenswrapper[4867]: I0320 00:33:11.145333 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m" event={"ID":"51581aac-eba3-4506-8565-9085cc1a2ab0","Type":"ContainerStarted","Data":"3dcda498d3a6f55a9806ca36b49998391fc5877b44636f12153a7a4e01ce6788"}
Mar 20 00:33:11 crc kubenswrapper[4867]: I0320 00:33:11.152105 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w" event={"ID":"61034c06-0be0-421a-86fe-88414af211f7","Type":"ContainerStarted","Data":"947ac9b7879c056b78663376825cde4c475a934013cb8057d6f99da100fbd6df"}
Mar 20 00:33:11 crc kubenswrapper[4867]: I0320 00:33:11.155858 4867 generic.go:334] "Generic (PLEG): container finished" podID="89c7f083-d89e-4576-9a27-adf5b9fdbaf5" containerID="f500ac012185afe88b0a534130d615b0b7a3ca72975cd24da5b989ca6f06a109" exitCode=0
Mar 20 00:33:11 crc kubenswrapper[4867]: I0320 00:33:11.155923 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"89c7f083-d89e-4576-9a27-adf5b9fdbaf5","Type":"ContainerDied","Data":"f500ac012185afe88b0a534130d615b0b7a3ca72975cd24da5b989ca6f06a109"}
Mar 20 00:33:18 crc kubenswrapper[4867]: I0320 00:33:18.410396 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 20 00:33:18 crc kubenswrapper[4867]: I0320 00:33:18.537684 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xkhm\" (UniqueName: \"kubernetes.io/projected/89c7f083-d89e-4576-9a27-adf5b9fdbaf5-kube-api-access-5xkhm\") pod \"89c7f083-d89e-4576-9a27-adf5b9fdbaf5\" (UID: \"89c7f083-d89e-4576-9a27-adf5b9fdbaf5\") "
Mar 20 00:33:18 crc kubenswrapper[4867]: I0320 00:33:18.558669 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c7f083-d89e-4576-9a27-adf5b9fdbaf5-kube-api-access-5xkhm" (OuterVolumeSpecName: "kube-api-access-5xkhm") pod "89c7f083-d89e-4576-9a27-adf5b9fdbaf5" (UID: "89c7f083-d89e-4576-9a27-adf5b9fdbaf5"). InnerVolumeSpecName "kube-api-access-5xkhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:33:18 crc kubenswrapper[4867]: I0320 00:33:18.576480 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_89c7f083-d89e-4576-9a27-adf5b9fdbaf5/curl/0.log"
Mar 20 00:33:18 crc kubenswrapper[4867]: I0320 00:33:18.642627 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xkhm\" (UniqueName: \"kubernetes.io/projected/89c7f083-d89e-4576-9a27-adf5b9fdbaf5-kube-api-access-5xkhm\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:18 crc kubenswrapper[4867]: I0320 00:33:18.878718 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lgfg8_18484175-1344-4a70-8e0f-60b1adaecae8/prometheus-webhook-snmp/0.log"
Mar 20 00:33:19 crc kubenswrapper[4867]: I0320 00:33:19.214136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"89c7f083-d89e-4576-9a27-adf5b9fdbaf5","Type":"ContainerDied","Data":"09512ae76c0979ca382d0e627823bd5b5e7aee790bc77e20944e8b9012fe574f"}
Mar 20 00:33:19 crc kubenswrapper[4867]: I0320 00:33:19.214171 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09512ae76c0979ca382d0e627823bd5b5e7aee790bc77e20944e8b9012fe574f"
Mar 20 00:33:19 crc kubenswrapper[4867]: I0320 00:33:19.214220 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 20 00:33:20 crc kubenswrapper[4867]: I0320 00:33:20.225843 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-ddh68" event={"ID":"49cf5b16-ce7e-43ce-8e45-2a665f5e0406","Type":"ContainerStarted","Data":"5e4e190f68ea88f3cd0eb91be6b9a4bf5247b0a2eb033a2ee15d4102104ac256"}
Mar 20 00:33:26 crc kubenswrapper[4867]: I0320 00:33:26.285014 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-ddh68" event={"ID":"49cf5b16-ce7e-43ce-8e45-2a665f5e0406","Type":"ContainerStarted","Data":"e243c0c5c3cb80ffafe74140e01d9624d6de19ec482cbdcf7c2eb0abab0e8647"}
Mar 20 00:33:26 crc kubenswrapper[4867]: I0320 00:33:26.305382 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-ddh68" podStartSLOduration=2.194504849 podStartE2EDuration="20.305358528s" podCreationTimestamp="2026-03-20 00:33:06 +0000 UTC" firstStartedPulling="2026-03-20 00:33:07.485138136 +0000 UTC m=+1601.711675653" lastFinishedPulling="2026-03-20 00:33:25.595991815 +0000 UTC m=+1619.822529332" observedRunningTime="2026-03-20 00:33:26.301758674 +0000 UTC m=+1620.528296221" watchObservedRunningTime="2026-03-20 00:33:26.305358528 +0000 UTC m=+1620.531896085"
Mar 20 00:33:49 crc kubenswrapper[4867]: I0320 00:33:49.076581 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lgfg8_18484175-1344-4a70-8e0f-60b1adaecae8/prometheus-webhook-snmp/0.log"
Mar 20 00:33:54 crc kubenswrapper[4867]: I0320 00:33:54.507217 4867 generic.go:334] "Generic (PLEG): container finished" podID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerID="5e4e190f68ea88f3cd0eb91be6b9a4bf5247b0a2eb033a2ee15d4102104ac256" exitCode=0
Mar 20 00:33:54 crc kubenswrapper[4867]: I0320 00:33:54.507318 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-ddh68" event={"ID":"49cf5b16-ce7e-43ce-8e45-2a665f5e0406","Type":"ContainerDied","Data":"5e4e190f68ea88f3cd0eb91be6b9a4bf5247b0a2eb033a2ee15d4102104ac256"}
Mar 20 00:33:54 crc kubenswrapper[4867]: I0320 00:33:54.508613 4867 scope.go:117] "RemoveContainer" containerID="5e4e190f68ea88f3cd0eb91be6b9a4bf5247b0a2eb033a2ee15d4102104ac256"
Mar 20 00:33:57 crc kubenswrapper[4867]: I0320 00:33:57.536035 4867 generic.go:334] "Generic (PLEG): container finished" podID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerID="e243c0c5c3cb80ffafe74140e01d9624d6de19ec482cbdcf7c2eb0abab0e8647" exitCode=0
Mar 20 00:33:57 crc kubenswrapper[4867]: I0320 00:33:57.536083 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-ddh68" event={"ID":"49cf5b16-ce7e-43ce-8e45-2a665f5e0406","Type":"ContainerDied","Data":"e243c0c5c3cb80ffafe74140e01d9624d6de19ec482cbdcf7c2eb0abab0e8647"}
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.860692 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990211 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-entrypoint-script\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990290 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-sensubility-config\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990342 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4cfs\" (UniqueName: \"kubernetes.io/projected/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-kube-api-access-x4cfs\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990429 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-healthcheck-log\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-config\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-entrypoint-script\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:58 crc kubenswrapper[4867]: I0320 00:33:58.990571 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-publisher\") pod \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\" (UID: \"49cf5b16-ce7e-43ce-8e45-2a665f5e0406\") "
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.004566 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-kube-api-access-x4cfs" (OuterVolumeSpecName: "kube-api-access-x4cfs") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "kube-api-access-x4cfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.013656 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.014356 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.014760 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.018029 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.019538 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.025013 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "49cf5b16-ce7e-43ce-8e45-2a665f5e0406" (UID: "49cf5b16-ce7e-43ce-8e45-2a665f5e0406"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.092106 4867 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-healthcheck-log\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.092666 4867 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-config\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.092827 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.092964 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-ceilometer-publisher\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.093122 4867 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.093251 4867 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-sensubility-config\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.093372 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4cfs\" (UniqueName: \"kubernetes.io/projected/49cf5b16-ce7e-43ce-8e45-2a665f5e0406-kube-api-access-x4cfs\") on node \"crc\" DevicePath \"\""
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.557417 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-ddh68" event={"ID":"49cf5b16-ce7e-43ce-8e45-2a665f5e0406","Type":"ContainerDied","Data":"eab86ee6185039bff8f1a46fd9eae140f754c19b0a8950faa811c4e19cfbb6f4"}
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.557464 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab86ee6185039bff8f1a46fd9eae140f754c19b0a8950faa811c4e19cfbb6f4"
Mar 20 00:33:59 crc kubenswrapper[4867]: I0320 00:33:59.557837 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-ddh68"
Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139283 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566114-vh922"]
Mar 20 00:34:00 crc kubenswrapper[4867]: E0320 00:34:00.139641 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerName="smoketest-ceilometer"
Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139657 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerName="smoketest-ceilometer"
Mar 20 00:34:00 crc kubenswrapper[4867]: E0320 00:34:00.139689 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerName="smoketest-collectd"
Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139697 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerName="smoketest-collectd"
Mar 20 00:34:00 crc kubenswrapper[4867]: E0320 00:34:00.139708 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c7f083-d89e-4576-9a27-adf5b9fdbaf5" containerName="curl"
Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139716 4867
state_mem.go:107] "Deleted CPUSet assignment" podUID="89c7f083-d89e-4576-9a27-adf5b9fdbaf5" containerName="curl" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139856 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerName="smoketest-collectd" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139891 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c7f083-d89e-4576-9a27-adf5b9fdbaf5" containerName="curl" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.139904 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="49cf5b16-ce7e-43ce-8e45-2a665f5e0406" containerName="smoketest-ceilometer" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.140419 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.142434 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.142620 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.142668 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.161237 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566114-vh922"] Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.208715 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9z8k\" (UniqueName: \"kubernetes.io/projected/26f6250a-0ede-42ce-9b60-501731111078-kube-api-access-r9z8k\") pod \"auto-csr-approver-29566114-vh922\" (UID: \"26f6250a-0ede-42ce-9b60-501731111078\") " 
pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.309896 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9z8k\" (UniqueName: \"kubernetes.io/projected/26f6250a-0ede-42ce-9b60-501731111078-kube-api-access-r9z8k\") pod \"auto-csr-approver-29566114-vh922\" (UID: \"26f6250a-0ede-42ce-9b60-501731111078\") " pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.350435 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9z8k\" (UniqueName: \"kubernetes.io/projected/26f6250a-0ede-42ce-9b60-501731111078-kube-api-access-r9z8k\") pod \"auto-csr-approver-29566114-vh922\" (UID: \"26f6250a-0ede-42ce-9b60-501731111078\") " pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.466574 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.881556 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-ddh68_49cf5b16-ce7e-43ce-8e45-2a665f5e0406/smoketest-collectd/0.log" Mar 20 00:34:00 crc kubenswrapper[4867]: I0320 00:34:00.898825 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566114-vh922"] Mar 20 00:34:01 crc kubenswrapper[4867]: I0320 00:34:01.227233 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-ddh68_49cf5b16-ce7e-43ce-8e45-2a665f5e0406/smoketest-ceilometer/0.log" Mar 20 00:34:01 crc kubenswrapper[4867]: I0320 00:34:01.573150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566114-vh922" 
event={"ID":"26f6250a-0ede-42ce-9b60-501731111078","Type":"ContainerStarted","Data":"f3286143507ad6ae2b1dba4f60b2bb8b4e08c08b306372daa955ed0cb247599c"} Mar 20 00:34:01 crc kubenswrapper[4867]: I0320 00:34:01.579676 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-l6fnr_063672f7-25c6-47fb-a6fc-8149c97062d5/default-interconnect/0.log" Mar 20 00:34:01 crc kubenswrapper[4867]: I0320 00:34:01.897295 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m_51581aac-eba3-4506-8565-9085cc1a2ab0/bridge/2.log" Mar 20 00:34:02 crc kubenswrapper[4867]: I0320 00:34:02.195756 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-fwh6m_51581aac-eba3-4506-8565-9085cc1a2ab0/sg-core/0.log" Mar 20 00:34:02 crc kubenswrapper[4867]: I0320 00:34:02.493056 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf_13a8a336-a26c-43c2-9acc-6d03492f9f0e/bridge/2.log" Mar 20 00:34:02 crc kubenswrapper[4867]: I0320 00:34:02.584364 4867 generic.go:334] "Generic (PLEG): container finished" podID="26f6250a-0ede-42ce-9b60-501731111078" containerID="6417d51eda5ea900ef404c16f23b23dd19ee4dad8027073fa412e28b181b9705" exitCode=0 Mar 20 00:34:02 crc kubenswrapper[4867]: I0320 00:34:02.584407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566114-vh922" event={"ID":"26f6250a-0ede-42ce-9b60-501731111078","Type":"ContainerDied","Data":"6417d51eda5ea900ef404c16f23b23dd19ee4dad8027073fa412e28b181b9705"} Mar 20 00:34:02 crc kubenswrapper[4867]: I0320 00:34:02.760538 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-59b988f5d5-7j8tf_13a8a336-a26c-43c2-9acc-6d03492f9f0e/sg-core/0.log" Mar 20 
00:34:03 crc kubenswrapper[4867]: I0320 00:34:03.070641 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w_61034c06-0be0-421a-86fe-88414af211f7/bridge/2.log" Mar 20 00:34:03 crc kubenswrapper[4867]: I0320 00:34:03.369359 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-7c82w_61034c06-0be0-421a-86fe-88414af211f7/sg-core/0.log" Mar 20 00:34:03 crc kubenswrapper[4867]: I0320 00:34:03.687921 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls_52172008-2d38-4ef9-81f0-ddafa1f96be7/bridge/2.log" Mar 20 00:34:03 crc kubenswrapper[4867]: I0320 00:34:03.925854 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:03 crc kubenswrapper[4867]: I0320 00:34:03.961418 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6dcdc4b85d-llvls_52172008-2d38-4ef9-81f0-ddafa1f96be7/sg-core/0.log" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.065182 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9z8k\" (UniqueName: \"kubernetes.io/projected/26f6250a-0ede-42ce-9b60-501731111078-kube-api-access-r9z8k\") pod \"26f6250a-0ede-42ce-9b60-501731111078\" (UID: \"26f6250a-0ede-42ce-9b60-501731111078\") " Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.071195 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f6250a-0ede-42ce-9b60-501731111078-kube-api-access-r9z8k" (OuterVolumeSpecName: "kube-api-access-r9z8k") pod "26f6250a-0ede-42ce-9b60-501731111078" (UID: "26f6250a-0ede-42ce-9b60-501731111078"). InnerVolumeSpecName "kube-api-access-r9z8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.167260 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9z8k\" (UniqueName: \"kubernetes.io/projected/26f6250a-0ede-42ce-9b60-501731111078-kube-api-access-r9z8k\") on node \"crc\" DevicePath \"\"" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.265673 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6_7c011887-4b7c-450a-8f02-40c60be82359/bridge/2.log" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.551081 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-jxmz6_7c011887-4b7c-450a-8f02-40c60be82359/sg-core/0.log" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.611398 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566114-vh922" event={"ID":"26f6250a-0ede-42ce-9b60-501731111078","Type":"ContainerDied","Data":"f3286143507ad6ae2b1dba4f60b2bb8b4e08c08b306372daa955ed0cb247599c"} Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.611449 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3286143507ad6ae2b1dba4f60b2bb8b4e08c08b306372daa955ed0cb247599c" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.611571 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566114-vh922" Mar 20 00:34:04 crc kubenswrapper[4867]: I0320 00:34:04.997061 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566108-78444"] Mar 20 00:34:05 crc kubenswrapper[4867]: I0320 00:34:05.001753 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566108-78444"] Mar 20 00:34:06 crc kubenswrapper[4867]: I0320 00:34:06.433582 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee1cbc7-9c0b-446c-8a56-d850d57561b9" path="/var/lib/kubelet/pods/1ee1cbc7-9c0b-446c-8a56-d850d57561b9/volumes" Mar 20 00:34:08 crc kubenswrapper[4867]: I0320 00:34:08.355410 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-5fff4dbc4c-bpbl5_7d8af434-ddb1-4216-bf3a-d123252c4325/operator/0.log" Mar 20 00:34:08 crc kubenswrapper[4867]: I0320 00:34:08.575523 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_4cea5088-2867-4484-914c-70644f9eed47/prometheus/0.log" Mar 20 00:34:08 crc kubenswrapper[4867]: I0320 00:34:08.850820 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_f4b5c2ce-c1f8-4340-acd1-699ed169fcfb/elasticsearch/0.log" Mar 20 00:34:09 crc kubenswrapper[4867]: I0320 00:34:09.132533 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lgfg8_18484175-1344-4a70-8e0f-60b1adaecae8/prometheus-webhook-snmp/0.log" Mar 20 00:34:09 crc kubenswrapper[4867]: I0320 00:34:09.381586 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_a53d90d8-2c87-4869-ad34-dc5db259425d/alertmanager/0.log" Mar 20 00:34:22 crc kubenswrapper[4867]: I0320 00:34:22.360397 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-7474fb55fb-cmvds_44792f23-3a52-46e0-b91d-1e1853b6437f/operator/0.log" Mar 20 00:34:25 crc kubenswrapper[4867]: I0320 00:34:25.426812 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-5fff4dbc4c-bpbl5_7d8af434-ddb1-4216-bf3a-d123252c4325/operator/0.log" Mar 20 00:34:25 crc kubenswrapper[4867]: I0320 00:34:25.688187 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_0fb658d4-90b0-4a29-9892-5baac55ad5e5/qdr/0.log" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.035735 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gw94c"] Mar 20 00:34:39 crc kubenswrapper[4867]: E0320 00:34:39.036470 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f6250a-0ede-42ce-9b60-501731111078" containerName="oc" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.036484 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f6250a-0ede-42ce-9b60-501731111078" containerName="oc" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.036680 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f6250a-0ede-42ce-9b60-501731111078" containerName="oc" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.037797 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.047851 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gw94c"] Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.113875 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckp8k\" (UniqueName: \"kubernetes.io/projected/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-kube-api-access-ckp8k\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.114049 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-catalog-content\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.114646 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-utilities\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.215589 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-catalog-content\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.215664 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-utilities\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.215704 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckp8k\" (UniqueName: \"kubernetes.io/projected/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-kube-api-access-ckp8k\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.216358 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-catalog-content\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.216595 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-utilities\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.233165 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckp8k\" (UniqueName: \"kubernetes.io/projected/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-kube-api-access-ckp8k\") pod \"community-operators-gw94c\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:39 crc kubenswrapper[4867]: I0320 00:34:39.363722 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:40 crc kubenswrapper[4867]: I0320 00:34:40.590645 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gw94c"] Mar 20 00:34:40 crc kubenswrapper[4867]: I0320 00:34:40.992933 4867 generic.go:334] "Generic (PLEG): container finished" podID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerID="bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a" exitCode=0 Mar 20 00:34:40 crc kubenswrapper[4867]: I0320 00:34:40.993508 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerDied","Data":"bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a"} Mar 20 00:34:40 crc kubenswrapper[4867]: I0320 00:34:40.993546 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerStarted","Data":"25aae79036c2b06796c68bd9fe99513efddfdf2d2b913cb55f41626be9a31d57"} Mar 20 00:34:42 crc kubenswrapper[4867]: I0320 00:34:42.001509 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerStarted","Data":"1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4"} Mar 20 00:34:43 crc kubenswrapper[4867]: I0320 00:34:43.011744 4867 generic.go:334] "Generic (PLEG): container finished" podID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerID="1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4" exitCode=0 Mar 20 00:34:43 crc kubenswrapper[4867]: I0320 00:34:43.011795 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" 
event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerDied","Data":"1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4"} Mar 20 00:34:44 crc kubenswrapper[4867]: I0320 00:34:44.026537 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerStarted","Data":"23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67"} Mar 20 00:34:44 crc kubenswrapper[4867]: I0320 00:34:44.057400 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gw94c" podStartSLOduration=2.54470691 podStartE2EDuration="5.057385593s" podCreationTimestamp="2026-03-20 00:34:39 +0000 UTC" firstStartedPulling="2026-03-20 00:34:40.996003946 +0000 UTC m=+1695.222541503" lastFinishedPulling="2026-03-20 00:34:43.508682669 +0000 UTC m=+1697.735220186" observedRunningTime="2026-03-20 00:34:44.054694923 +0000 UTC m=+1698.281232450" watchObservedRunningTime="2026-03-20 00:34:44.057385593 +0000 UTC m=+1698.283923100" Mar 20 00:34:48 crc kubenswrapper[4867]: I0320 00:34:48.859909 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:34:48 crc kubenswrapper[4867]: I0320 00:34:48.860334 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:34:49 crc kubenswrapper[4867]: I0320 00:34:49.364309 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:49 crc kubenswrapper[4867]: I0320 00:34:49.364676 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:49 crc kubenswrapper[4867]: I0320 00:34:49.415467 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:50 crc kubenswrapper[4867]: I0320 00:34:50.167012 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:50 crc kubenswrapper[4867]: I0320 00:34:50.204960 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gw94c"] Mar 20 00:34:52 crc kubenswrapper[4867]: I0320 00:34:52.093743 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gw94c" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="registry-server" containerID="cri-o://23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67" gracePeriod=2 Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.101397 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.101629 4867 generic.go:334] "Generic (PLEG): container finished" podID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerID="23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67" exitCode=0 Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.101655 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerDied","Data":"23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67"} Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.101866 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw94c" event={"ID":"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7","Type":"ContainerDied","Data":"25aae79036c2b06796c68bd9fe99513efddfdf2d2b913cb55f41626be9a31d57"} Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.101892 4867 scope.go:117] "RemoveContainer" containerID="23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.130285 4867 scope.go:117] "RemoveContainer" containerID="1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.154998 4867 scope.go:117] "RemoveContainer" containerID="bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.163330 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckp8k\" (UniqueName: \"kubernetes.io/projected/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-kube-api-access-ckp8k\") pod \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.169847 4867 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-kube-api-access-ckp8k" (OuterVolumeSpecName: "kube-api-access-ckp8k") pod "ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" (UID: "ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7"). InnerVolumeSpecName "kube-api-access-ckp8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.173600 4867 scope.go:117] "RemoveContainer" containerID="23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67" Mar 20 00:34:53 crc kubenswrapper[4867]: E0320 00:34:53.174187 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67\": container with ID starting with 23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67 not found: ID does not exist" containerID="23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.174255 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67"} err="failed to get container status \"23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67\": rpc error: code = NotFound desc = could not find container \"23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67\": container with ID starting with 23687a7f5d4a11d3b7d760d760d9049eb582d5e1eb1104feb9532f4c6e81af67 not found: ID does not exist" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.174303 4867 scope.go:117] "RemoveContainer" containerID="1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4" Mar 20 00:34:53 crc kubenswrapper[4867]: E0320 00:34:53.174841 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4\": container with ID starting with 1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4 not found: ID does not exist" containerID="1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.174920 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4"} err="failed to get container status \"1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4\": rpc error: code = NotFound desc = could not find container \"1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4\": container with ID starting with 1979eeab86c293945141f97f3b69369ec7f073563107e95c41278dda4ee767c4 not found: ID does not exist" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.174963 4867 scope.go:117] "RemoveContainer" containerID="bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a" Mar 20 00:34:53 crc kubenswrapper[4867]: E0320 00:34:53.175323 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a\": container with ID starting with bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a not found: ID does not exist" containerID="bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.175377 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a"} err="failed to get container status \"bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a\": rpc error: code = NotFound desc = could not find container \"bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a\": container with ID 
starting with bebe927d284b3cbff957b00f098a5716380cb74aebe3845949efe0e01561132a not found: ID does not exist" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.264378 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-catalog-content\") pod \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.264428 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-utilities\") pod \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\" (UID: \"ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7\") " Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.264652 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckp8k\" (UniqueName: \"kubernetes.io/projected/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-kube-api-access-ckp8k\") on node \"crc\" DevicePath \"\"" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.265370 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-utilities" (OuterVolumeSpecName: "utilities") pod "ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" (UID: "ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.326337 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" (UID: "ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.365805 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 00:34:53 crc kubenswrapper[4867]: I0320 00:34:53.365841 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 00:34:54 crc kubenswrapper[4867]: I0320 00:34:54.109341 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gw94c" Mar 20 00:34:54 crc kubenswrapper[4867]: I0320 00:34:54.136374 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gw94c"] Mar 20 00:34:54 crc kubenswrapper[4867]: I0320 00:34:54.140821 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gw94c"] Mar 20 00:34:54 crc kubenswrapper[4867]: I0320 00:34:54.435385 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" path="/var/lib/kubelet/pods/ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7/volumes" Mar 20 00:34:54 crc kubenswrapper[4867]: I0320 00:34:54.979775 4867 scope.go:117] "RemoveContainer" containerID="f4e7c692e880a3bf275a26451cfb58488797fca1fe5b363939395239cca67959" Mar 20 00:34:55 crc kubenswrapper[4867]: I0320 00:34:55.003233 4867 scope.go:117] "RemoveContainer" containerID="81c7ac2e7cf52140b5c923ad25f0cbff78951baf2c06df6cef1a1ddc5c938c4a" Mar 20 00:34:55 crc kubenswrapper[4867]: I0320 00:34:55.048078 4867 scope.go:117] "RemoveContainer" containerID="385e6b6152cb79479396db07f999c6c68101ca0cb80daa78acda438c564a4544" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.915677 4867 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5vqj2/must-gather-gdzrv"] Mar 20 00:34:59 crc kubenswrapper[4867]: E0320 00:34:59.916367 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="extract-content" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.916379 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="extract-content" Mar 20 00:34:59 crc kubenswrapper[4867]: E0320 00:34:59.916395 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="registry-server" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.916401 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="registry-server" Mar 20 00:34:59 crc kubenswrapper[4867]: E0320 00:34:59.916416 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="extract-utilities" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.916422 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="extract-utilities" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.916554 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea82d0b5-629c-4a69-bbd9-ccd4f601f9f7" containerName="registry-server" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.917169 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.919133 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5vqj2"/"kube-root-ca.crt" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.921046 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5vqj2"/"default-dockercfg-44kkj" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.924216 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5vqj2"/"openshift-service-ca.crt" Mar 20 00:34:59 crc kubenswrapper[4867]: I0320 00:34:59.934577 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5vqj2/must-gather-gdzrv"] Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.065574 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv6pv\" (UniqueName: \"kubernetes.io/projected/fa5053d6-71ce-40bb-b428-49388b5bad95-kube-api-access-sv6pv\") pod \"must-gather-gdzrv\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.065634 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa5053d6-71ce-40bb-b428-49388b5bad95-must-gather-output\") pod \"must-gather-gdzrv\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.167342 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa5053d6-71ce-40bb-b428-49388b5bad95-must-gather-output\") pod \"must-gather-gdzrv\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " 
pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.167389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv6pv\" (UniqueName: \"kubernetes.io/projected/fa5053d6-71ce-40bb-b428-49388b5bad95-kube-api-access-sv6pv\") pod \"must-gather-gdzrv\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.167913 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa5053d6-71ce-40bb-b428-49388b5bad95-must-gather-output\") pod \"must-gather-gdzrv\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.191848 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv6pv\" (UniqueName: \"kubernetes.io/projected/fa5053d6-71ce-40bb-b428-49388b5bad95-kube-api-access-sv6pv\") pod \"must-gather-gdzrv\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.232022 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:35:00 crc kubenswrapper[4867]: I0320 00:35:00.665939 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5vqj2/must-gather-gdzrv"] Mar 20 00:35:01 crc kubenswrapper[4867]: I0320 00:35:01.161016 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" event={"ID":"fa5053d6-71ce-40bb-b428-49388b5bad95","Type":"ContainerStarted","Data":"20f329a6c9fabca67c53ff18c98b824a89f97b024f1dbaf07733f8fa71f8ceef"} Mar 20 00:35:08 crc kubenswrapper[4867]: I0320 00:35:08.219236 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" event={"ID":"fa5053d6-71ce-40bb-b428-49388b5bad95","Type":"ContainerStarted","Data":"4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c"} Mar 20 00:35:08 crc kubenswrapper[4867]: I0320 00:35:08.219837 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" event={"ID":"fa5053d6-71ce-40bb-b428-49388b5bad95","Type":"ContainerStarted","Data":"9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590"} Mar 20 00:35:18 crc kubenswrapper[4867]: I0320 00:35:18.860757 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:35:18 crc kubenswrapper[4867]: I0320 00:35:18.861355 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:35:48 crc 
kubenswrapper[4867]: I0320 00:35:48.860037 4867 patch_prober.go:28] interesting pod/machine-config-daemon-v9vbm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 00:35:48 crc kubenswrapper[4867]: I0320 00:35:48.861549 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 00:35:48 crc kubenswrapper[4867]: I0320 00:35:48.861710 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" Mar 20 00:35:48 crc kubenswrapper[4867]: I0320 00:35:48.862562 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"} pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 00:35:48 crc kubenswrapper[4867]: I0320 00:35:48.862741 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerName="machine-config-daemon" containerID="cri-o://8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" gracePeriod=600 Mar 20 00:35:48 crc kubenswrapper[4867]: E0320 00:35:48.998276 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:35:49 crc kubenswrapper[4867]: I0320 00:35:49.850683 4867 generic.go:334] "Generic (PLEG): container finished" podID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" exitCode=0 Mar 20 00:35:49 crc kubenswrapper[4867]: I0320 00:35:49.850737 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" event={"ID":"00eacbd3-d921-414b-8b8d-c4298bdd5a28","Type":"ContainerDied","Data":"8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"} Mar 20 00:35:49 crc kubenswrapper[4867]: I0320 00:35:49.851066 4867 scope.go:117] "RemoveContainer" containerID="533398062bfbfbe1fa6b73d9da3e04fbe63602429f4dc45fdc1ed10c2dc75279" Mar 20 00:35:49 crc kubenswrapper[4867]: I0320 00:35:49.851536 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:35:49 crc kubenswrapper[4867]: E0320 00:35:49.851762 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:35:49 crc kubenswrapper[4867]: I0320 00:35:49.880901 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" podStartSLOduration=44.323013332 podStartE2EDuration="50.880883023s" podCreationTimestamp="2026-03-20 
00:34:59 +0000 UTC" firstStartedPulling="2026-03-20 00:35:00.672129836 +0000 UTC m=+1714.898667363" lastFinishedPulling="2026-03-20 00:35:07.229999537 +0000 UTC m=+1721.456537054" observedRunningTime="2026-03-20 00:35:08.240692336 +0000 UTC m=+1722.467229873" watchObservedRunningTime="2026-03-20 00:35:49.880883023 +0000 UTC m=+1764.107420540" Mar 20 00:35:51 crc kubenswrapper[4867]: I0320 00:35:51.394162 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qxlq5_3c795069-a3a9-4cdd-994d-23b51b69f386/control-plane-machine-set-operator/0.log" Mar 20 00:35:51 crc kubenswrapper[4867]: I0320 00:35:51.484008 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-n22zb_f6ed1b0f-b167-4dd9-9cfb-0687dad12d05/kube-rbac-proxy/0.log" Mar 20 00:35:51 crc kubenswrapper[4867]: I0320 00:35:51.527874 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-n22zb_f6ed1b0f-b167-4dd9-9cfb-0687dad12d05/machine-api-operator/0.log" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.138680 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566116-h9g5c"] Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.141073 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.143525 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.143684 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.143945 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.154139 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566116-h9g5c"] Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.261485 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szfwl\" (UniqueName: \"kubernetes.io/projected/9db9c149-36c7-457f-b769-13358f70e787-kube-api-access-szfwl\") pod \"auto-csr-approver-29566116-h9g5c\" (UID: \"9db9c149-36c7-457f-b769-13358f70e787\") " pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.362817 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szfwl\" (UniqueName: \"kubernetes.io/projected/9db9c149-36c7-457f-b769-13358f70e787-kube-api-access-szfwl\") pod \"auto-csr-approver-29566116-h9g5c\" (UID: \"9db9c149-36c7-457f-b769-13358f70e787\") " pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.384812 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szfwl\" (UniqueName: \"kubernetes.io/projected/9db9c149-36c7-457f-b769-13358f70e787-kube-api-access-szfwl\") pod \"auto-csr-approver-29566116-h9g5c\" (UID: \"9db9c149-36c7-457f-b769-13358f70e787\") " 
pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.421675 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:36:00 crc kubenswrapper[4867]: E0320 00:36:00.422120 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.460954 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.877940 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566116-h9g5c"] Mar 20 00:36:00 crc kubenswrapper[4867]: I0320 00:36:00.936992 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" event={"ID":"9db9c149-36c7-457f-b769-13358f70e787","Type":"ContainerStarted","Data":"e3083fc44bb91a92fbcf8158115b2399ad3e8f7766b4cb0be79e22fa6c3580ca"} Mar 20 00:36:02 crc kubenswrapper[4867]: I0320 00:36:02.952916 4867 generic.go:334] "Generic (PLEG): container finished" podID="9db9c149-36c7-457f-b769-13358f70e787" containerID="1bf8bbe7e8688e96ece5c14fd0231c5e9fbd5edba5c48d0968309107d0ea4f91" exitCode=0 Mar 20 00:36:02 crc kubenswrapper[4867]: I0320 00:36:02.953109 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" event={"ID":"9db9c149-36c7-457f-b769-13358f70e787","Type":"ContainerDied","Data":"1bf8bbe7e8688e96ece5c14fd0231c5e9fbd5edba5c48d0968309107d0ea4f91"} 
Mar 20 00:36:03 crc kubenswrapper[4867]: I0320 00:36:03.599892 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-hkcm2_673670e9-a11e-4623-b235-b8c8aecbf191/cert-manager-controller/0.log" Mar 20 00:36:03 crc kubenswrapper[4867]: I0320 00:36:03.704991 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-d72zn_edaa424d-60d3-4146-be50-de91d495771c/cert-manager-cainjector/0.log" Mar 20 00:36:03 crc kubenswrapper[4867]: I0320 00:36:03.739096 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-p6hww_6af9c67f-2a80-4c2d-8411-ebb8606657ca/cert-manager-webhook/0.log" Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.183584 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.326907 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szfwl\" (UniqueName: \"kubernetes.io/projected/9db9c149-36c7-457f-b769-13358f70e787-kube-api-access-szfwl\") pod \"9db9c149-36c7-457f-b769-13358f70e787\" (UID: \"9db9c149-36c7-457f-b769-13358f70e787\") " Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.333211 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db9c149-36c7-457f-b769-13358f70e787-kube-api-access-szfwl" (OuterVolumeSpecName: "kube-api-access-szfwl") pod "9db9c149-36c7-457f-b769-13358f70e787" (UID: "9db9c149-36c7-457f-b769-13358f70e787"). InnerVolumeSpecName "kube-api-access-szfwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.428015 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szfwl\" (UniqueName: \"kubernetes.io/projected/9db9c149-36c7-457f-b769-13358f70e787-kube-api-access-szfwl\") on node \"crc\" DevicePath \"\"" Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.969045 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" event={"ID":"9db9c149-36c7-457f-b769-13358f70e787","Type":"ContainerDied","Data":"e3083fc44bb91a92fbcf8158115b2399ad3e8f7766b4cb0be79e22fa6c3580ca"} Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.969292 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3083fc44bb91a92fbcf8158115b2399ad3e8f7766b4cb0be79e22fa6c3580ca" Mar 20 00:36:04 crc kubenswrapper[4867]: I0320 00:36:04.969133 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566116-h9g5c" Mar 20 00:36:05 crc kubenswrapper[4867]: I0320 00:36:05.251854 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566110-wcb2q"] Mar 20 00:36:05 crc kubenswrapper[4867]: I0320 00:36:05.259623 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566110-wcb2q"] Mar 20 00:36:06 crc kubenswrapper[4867]: I0320 00:36:06.432731 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3874203e-db49-4df9-ad3e-2064bc94dfd3" path="/var/lib/kubelet/pods/3874203e-db49-4df9-ad3e-2064bc94dfd3/volumes" Mar 20 00:36:11 crc kubenswrapper[4867]: I0320 00:36:11.421347 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:36:11 crc kubenswrapper[4867]: E0320 00:36:11.421990 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:36:17 crc kubenswrapper[4867]: I0320 00:36:17.217753 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-kzq86_b8b87de0-04fd-464f-8357-df1bed1c47e9/prometheus-operator/0.log" Mar 20 00:36:17 crc kubenswrapper[4867]: I0320 00:36:17.332257 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm_ae3d5dfc-6422-4cea-81fe-b238e1e25562/prometheus-operator-admission-webhook/0.log" Mar 20 00:36:17 crc kubenswrapper[4867]: I0320 00:36:17.436893 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2_1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09/prometheus-operator-admission-webhook/0.log" Mar 20 00:36:17 crc kubenswrapper[4867]: I0320 00:36:17.485225 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-229gh_a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae/operator/0.log" Mar 20 00:36:17 crc kubenswrapper[4867]: I0320 00:36:17.618795 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-59494d8978-x6l52_2a6984f3-b144-46cf-b343-6d8a75ebb8d7/perses-operator/0.log" Mar 20 00:36:25 crc kubenswrapper[4867]: I0320 00:36:25.421856 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:36:25 crc kubenswrapper[4867]: E0320 00:36:25.422604 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.362736 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/util/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.520686 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/pull/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.540869 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/pull/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.572920 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/util/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.739818 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/util/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.766111 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/pull/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.766674 4867 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fk62mj_918d65ef-006f-4f3a-8736-6af79633c18b/extract/0.log" Mar 20 00:36:30 crc kubenswrapper[4867]: I0320 00:36:30.881684 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/util/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.028225 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/pull/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.041372 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/util/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.067962 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/pull/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.192641 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/util/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.200316 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/extract/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.222540 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ec59nw_be85d851-fa19-4a82-b64c-80a9a261b53b/pull/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.365705 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/util/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.527934 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/util/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.561428 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/pull/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.577037 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/pull/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.708969 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/util/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.722770 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/extract/0.log" Mar 20 00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.724732 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5mm6m8_9a1e5e65-8e08-44b2-aac0-2b446cd6d516/pull/0.log" Mar 20 
00:36:31 crc kubenswrapper[4867]: I0320 00:36:31.861101 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/util/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.040555 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/pull/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.046046 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/util/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.054902 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/pull/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.204834 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/util/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.211905 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/extract/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.217148 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lqbcf_4438858b-60d5-4652-842c-c007bad8b04f/pull/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.350925 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/extract-utilities/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.491076 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/extract-content/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.498041 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/extract-utilities/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.505549 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/extract-content/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.689593 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/extract-utilities/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.692150 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/extract-content/0.log" Mar 20 00:36:32 crc kubenswrapper[4867]: I0320 00:36:32.930793 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/extract-utilities/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.091152 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/extract-utilities/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.107247 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zdhfk_de6ba182-4e03-41a5-91fc-89a5528b1d64/registry-server/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.114254 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/extract-content/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.138887 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/extract-content/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.337413 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/extract-utilities/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.343393 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/extract-content/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.582564 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w9fsp_d521d536-d270-40aa-9c6e-e80b679d1ecd/marketplace-operator/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.610424 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/extract-utilities/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.796542 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ng8hp_339c89c2-adce-4198-aa16-c34cc8e8176a/registry-server/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.881142 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/extract-content/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.881785 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/extract-utilities/0.log" Mar 20 00:36:33 crc kubenswrapper[4867]: I0320 00:36:33.902456 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/extract-content/0.log" Mar 20 00:36:34 crc kubenswrapper[4867]: I0320 00:36:34.068414 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/extract-utilities/0.log" Mar 20 00:36:34 crc kubenswrapper[4867]: I0320 00:36:34.103109 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/extract-content/0.log" Mar 20 00:36:34 crc kubenswrapper[4867]: I0320 00:36:34.303794 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7xpgd_accffd80-6502-4252-b9c1-5c6901af4739/registry-server/0.log" Mar 20 00:36:40 crc kubenswrapper[4867]: I0320 00:36:40.421865 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:36:40 crc kubenswrapper[4867]: E0320 00:36:40.422534 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:36:45 crc 
kubenswrapper[4867]: I0320 00:36:45.970555 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bb9c7775-vldz2_1ce0b3f6-ac02-4d16-a2f0-c26ebebe9a09/prometheus-operator-admission-webhook/0.log" Mar 20 00:36:45 crc kubenswrapper[4867]: I0320 00:36:45.993660 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-kzq86_b8b87de0-04fd-464f-8357-df1bed1c47e9/prometheus-operator/0.log" Mar 20 00:36:46 crc kubenswrapper[4867]: I0320 00:36:46.031984 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bb9c7775-fndpm_ae3d5dfc-6422-4cea-81fe-b238e1e25562/prometheus-operator-admission-webhook/0.log" Mar 20 00:36:46 crc kubenswrapper[4867]: I0320 00:36:46.059391 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-229gh_a450f0d4-eeea-4cf5-ad11-be8bf9bd91ae/operator/0.log" Mar 20 00:36:46 crc kubenswrapper[4867]: I0320 00:36:46.186480 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-59494d8978-x6l52_2a6984f3-b144-46cf-b343-6d8a75ebb8d7/perses-operator/0.log" Mar 20 00:36:53 crc kubenswrapper[4867]: I0320 00:36:53.421661 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:36:53 crc kubenswrapper[4867]: E0320 00:36:53.422252 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:36:55 crc kubenswrapper[4867]: 
I0320 00:36:55.137953 4867 scope.go:117] "RemoveContainer" containerID="174ee34b0906f9f3e1de47e2f3826e69e611717537339e3cdc1605e95fc84925" Mar 20 00:37:07 crc kubenswrapper[4867]: I0320 00:37:07.421341 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:37:07 crc kubenswrapper[4867]: E0320 00:37:07.422113 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:37:21 crc kubenswrapper[4867]: I0320 00:37:21.422158 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:37:21 crc kubenswrapper[4867]: E0320 00:37:21.422848 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:37:33 crc kubenswrapper[4867]: I0320 00:37:33.421225 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:37:33 crc kubenswrapper[4867]: E0320 00:37:33.422342 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:37:38 crc kubenswrapper[4867]: I0320 00:37:38.736530 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerID="9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590" exitCode=0 Mar 20 00:37:38 crc kubenswrapper[4867]: I0320 00:37:38.736641 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" event={"ID":"fa5053d6-71ce-40bb-b428-49388b5bad95","Type":"ContainerDied","Data":"9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590"} Mar 20 00:37:38 crc kubenswrapper[4867]: I0320 00:37:38.737545 4867 scope.go:117] "RemoveContainer" containerID="9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590" Mar 20 00:37:39 crc kubenswrapper[4867]: I0320 00:37:39.294005 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5vqj2_must-gather-gdzrv_fa5053d6-71ce-40bb-b428-49388b5bad95/gather/0.log" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.256635 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5vqj2/must-gather-gdzrv"] Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.257605 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="copy" containerID="cri-o://4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c" gracePeriod=2 Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.265370 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5vqj2/must-gather-gdzrv"] Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.674407 4867 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-must-gather-5vqj2_must-gather-gdzrv_fa5053d6-71ce-40bb-b428-49388b5bad95/copy/0.log" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.675160 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.795737 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa5053d6-71ce-40bb-b428-49388b5bad95-must-gather-output\") pod \"fa5053d6-71ce-40bb-b428-49388b5bad95\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.795871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv6pv\" (UniqueName: \"kubernetes.io/projected/fa5053d6-71ce-40bb-b428-49388b5bad95-kube-api-access-sv6pv\") pod \"fa5053d6-71ce-40bb-b428-49388b5bad95\" (UID: \"fa5053d6-71ce-40bb-b428-49388b5bad95\") " Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.802207 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5053d6-71ce-40bb-b428-49388b5bad95-kube-api-access-sv6pv" (OuterVolumeSpecName: "kube-api-access-sv6pv") pod "fa5053d6-71ce-40bb-b428-49388b5bad95" (UID: "fa5053d6-71ce-40bb-b428-49388b5bad95"). InnerVolumeSpecName "kube-api-access-sv6pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.819713 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5vqj2_must-gather-gdzrv_fa5053d6-71ce-40bb-b428-49388b5bad95/copy/0.log" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.820121 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerID="4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c" exitCode=143 Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.820174 4867 scope.go:117] "RemoveContainer" containerID="4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.820225 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5vqj2/must-gather-gdzrv" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.850135 4867 scope.go:117] "RemoveContainer" containerID="9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.872250 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5053d6-71ce-40bb-b428-49388b5bad95-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fa5053d6-71ce-40bb-b428-49388b5bad95" (UID: "fa5053d6-71ce-40bb-b428-49388b5bad95"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.897645 4867 scope.go:117] "RemoveContainer" containerID="4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.897715 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv6pv\" (UniqueName: \"kubernetes.io/projected/fa5053d6-71ce-40bb-b428-49388b5bad95-kube-api-access-sv6pv\") on node \"crc\" DevicePath \"\"" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.897753 4867 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa5053d6-71ce-40bb-b428-49388b5bad95-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 00:37:46 crc kubenswrapper[4867]: E0320 00:37:46.900022 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c\": container with ID starting with 4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c not found: ID does not exist" containerID="4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.900064 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c"} err="failed to get container status \"4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c\": rpc error: code = NotFound desc = could not find container \"4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c\": container with ID starting with 4391bc1253a77033ecbb69f10983ec452c7d111388c5a2212385a6285310e85c not found: ID does not exist" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.900090 4867 scope.go:117] "RemoveContainer" 
containerID="9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590" Mar 20 00:37:46 crc kubenswrapper[4867]: E0320 00:37:46.900752 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590\": container with ID starting with 9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590 not found: ID does not exist" containerID="9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590" Mar 20 00:37:46 crc kubenswrapper[4867]: I0320 00:37:46.900791 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590"} err="failed to get container status \"9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590\": rpc error: code = NotFound desc = could not find container \"9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590\": container with ID starting with 9d1111cb2431120cf5147b2f373d4c285e434836832c499c59053526e3010590 not found: ID does not exist" Mar 20 00:37:48 crc kubenswrapper[4867]: I0320 00:37:48.422256 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:37:48 crc kubenswrapper[4867]: E0320 00:37:48.423404 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:37:48 crc kubenswrapper[4867]: I0320 00:37:48.439908 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" 
path="/var/lib/kubelet/pods/fa5053d6-71ce-40bb-b428-49388b5bad95/volumes" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.148622 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566118-8qdm2"] Mar 20 00:38:00 crc kubenswrapper[4867]: E0320 00:38:00.149422 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="gather" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.149439 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="gather" Mar 20 00:38:00 crc kubenswrapper[4867]: E0320 00:38:00.149460 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db9c149-36c7-457f-b769-13358f70e787" containerName="oc" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.149469 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db9c149-36c7-457f-b769-13358f70e787" containerName="oc" Mar 20 00:38:00 crc kubenswrapper[4867]: E0320 00:38:00.149521 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="copy" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.149530 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="copy" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.149674 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="copy" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.149687 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5053d6-71ce-40bb-b428-49388b5bad95" containerName="gather" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.149700 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db9c149-36c7-457f-b769-13358f70e787" containerName="oc" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 
00:38:00.150229 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.153032 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.154318 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.155537 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.163375 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566118-8qdm2"] Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.199241 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh8n\" (UniqueName: \"kubernetes.io/projected/1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8-kube-api-access-zwh8n\") pod \"auto-csr-approver-29566118-8qdm2\" (UID: \"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8\") " pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.300834 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh8n\" (UniqueName: \"kubernetes.io/projected/1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8-kube-api-access-zwh8n\") pod \"auto-csr-approver-29566118-8qdm2\" (UID: \"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8\") " pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.332942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh8n\" (UniqueName: \"kubernetes.io/projected/1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8-kube-api-access-zwh8n\") pod 
\"auto-csr-approver-29566118-8qdm2\" (UID: \"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8\") " pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.473080 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.908057 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566118-8qdm2"] Mar 20 00:38:00 crc kubenswrapper[4867]: I0320 00:38:00.946988 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" event={"ID":"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8","Type":"ContainerStarted","Data":"6fa63d96a30695c32bba62a0cae43e4b73a749ad375160a6102f6b00f46661cc"} Mar 20 00:38:02 crc kubenswrapper[4867]: I0320 00:38:02.421351 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:38:02 crc kubenswrapper[4867]: E0320 00:38:02.421937 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:38:02 crc kubenswrapper[4867]: I0320 00:38:02.963444 4867 generic.go:334] "Generic (PLEG): container finished" podID="1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8" containerID="d5d017eacc1cb66c148ce9a5eb489558499a0d8fed06e6253045a0a475cb46b9" exitCode=0 Mar 20 00:38:02 crc kubenswrapper[4867]: I0320 00:38:02.963538 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" 
event={"ID":"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8","Type":"ContainerDied","Data":"d5d017eacc1cb66c148ce9a5eb489558499a0d8fed06e6253045a0a475cb46b9"} Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.291922 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.479797 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwh8n\" (UniqueName: \"kubernetes.io/projected/1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8-kube-api-access-zwh8n\") pod \"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8\" (UID: \"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8\") " Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.487904 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8-kube-api-access-zwh8n" (OuterVolumeSpecName: "kube-api-access-zwh8n") pod "1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8" (UID: "1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8"). InnerVolumeSpecName "kube-api-access-zwh8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.582027 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwh8n\" (UniqueName: \"kubernetes.io/projected/1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8-kube-api-access-zwh8n\") on node \"crc\" DevicePath \"\"" Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.988698 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" event={"ID":"1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8","Type":"ContainerDied","Data":"6fa63d96a30695c32bba62a0cae43e4b73a749ad375160a6102f6b00f46661cc"} Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.988766 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa63d96a30695c32bba62a0cae43e4b73a749ad375160a6102f6b00f46661cc" Mar 20 00:38:04 crc kubenswrapper[4867]: I0320 00:38:04.988818 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566118-8qdm2" Mar 20 00:38:05 crc kubenswrapper[4867]: I0320 00:38:05.369428 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566112-xv5p6"] Mar 20 00:38:05 crc kubenswrapper[4867]: I0320 00:38:05.385408 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566112-xv5p6"] Mar 20 00:38:06 crc kubenswrapper[4867]: I0320 00:38:06.440467 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a980d9d-47a1-4275-99e9-0677f0baff73" path="/var/lib/kubelet/pods/7a980d9d-47a1-4275-99e9-0677f0baff73/volumes" Mar 20 00:38:16 crc kubenswrapper[4867]: I0320 00:38:16.428189 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:38:16 crc kubenswrapper[4867]: E0320 00:38:16.429545 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:38:27 crc kubenswrapper[4867]: I0320 00:38:27.421298 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:38:27 crc kubenswrapper[4867]: E0320 00:38:27.422186 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:38:41 crc kubenswrapper[4867]: I0320 00:38:41.422619 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:38:41 crc kubenswrapper[4867]: E0320 00:38:41.423790 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:38:54 crc kubenswrapper[4867]: I0320 00:38:54.421764 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:38:54 crc kubenswrapper[4867]: E0320 00:38:54.423027 4867 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:38:55 crc kubenswrapper[4867]: I0320 00:38:55.235168 4867 scope.go:117] "RemoveContainer" containerID="e0f9c774f98d8a4bea376fa9477732776c21be04db0a52f63c4a121fdd47a6fa" Mar 20 00:39:08 crc kubenswrapper[4867]: I0320 00:39:08.422432 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:39:08 crc kubenswrapper[4867]: E0320 00:39:08.423718 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:39:22 crc kubenswrapper[4867]: I0320 00:39:22.422653 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:39:22 crc kubenswrapper[4867]: E0320 00:39:22.424655 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:39:37 crc kubenswrapper[4867]: I0320 00:39:37.422210 4867 scope.go:117] 
"RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:39:37 crc kubenswrapper[4867]: E0320 00:39:37.423457 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:39:52 crc kubenswrapper[4867]: I0320 00:39:52.422333 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5" Mar 20 00:39:52 crc kubenswrapper[4867]: E0320 00:39:52.423315 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28" Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.137267 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566120-hcrpj"] Mar 20 00:40:00 crc kubenswrapper[4867]: E0320 00:40:00.138076 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8" containerName="oc" Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.138089 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8" containerName="oc" Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.138199 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca5fd7a-8826-4fd1-a051-2af3b6ec58d8" containerName="oc" Mar 
20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.138772 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.141059 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lxzm5"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.141089 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.141540 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.142986 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566120-hcrpj"]
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.326091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmrd\" (UniqueName: \"kubernetes.io/projected/66f75f5d-b2ee-4f50-9745-46bd1ee8cca2-kube-api-access-sdmrd\") pod \"auto-csr-approver-29566120-hcrpj\" (UID: \"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2\") " pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.427913 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmrd\" (UniqueName: \"kubernetes.io/projected/66f75f5d-b2ee-4f50-9745-46bd1ee8cca2-kube-api-access-sdmrd\") pod \"auto-csr-approver-29566120-hcrpj\" (UID: \"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2\") " pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.458149 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmrd\" (UniqueName: \"kubernetes.io/projected/66f75f5d-b2ee-4f50-9745-46bd1ee8cca2-kube-api-access-sdmrd\") pod \"auto-csr-approver-29566120-hcrpj\" (UID: \"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2\") " pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.461952 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.682084 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566120-hcrpj"]
Mar 20 00:40:00 crc kubenswrapper[4867]: W0320 00:40:00.689786 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f75f5d_b2ee_4f50_9745_46bd1ee8cca2.slice/crio-145d10d09a2002ad615092a9c78c4cec3fd457df971f98b241f322feb1e6e7ad WatchSource:0}: Error finding container 145d10d09a2002ad615092a9c78c4cec3fd457df971f98b241f322feb1e6e7ad: Status 404 returned error can't find the container with id 145d10d09a2002ad615092a9c78c4cec3fd457df971f98b241f322feb1e6e7ad
Mar 20 00:40:00 crc kubenswrapper[4867]: I0320 00:40:00.692658 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 00:40:01 crc kubenswrapper[4867]: I0320 00:40:01.106482 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566120-hcrpj" event={"ID":"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2","Type":"ContainerStarted","Data":"145d10d09a2002ad615092a9c78c4cec3fd457df971f98b241f322feb1e6e7ad"}
Mar 20 00:40:02 crc kubenswrapper[4867]: I0320 00:40:02.120674 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566120-hcrpj" event={"ID":"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2","Type":"ContainerStarted","Data":"273f4ba703eca740accd2d868b734ee038ccc71fc67b3c4c0a17c1c821e341cd"}
Mar 20 00:40:02 crc kubenswrapper[4867]: I0320 00:40:02.150762 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566120-hcrpj" podStartSLOduration=1.255840812 podStartE2EDuration="2.150721965s" podCreationTimestamp="2026-03-20 00:40:00 +0000 UTC" firstStartedPulling="2026-03-20 00:40:00.692443172 +0000 UTC m=+2014.918980689" lastFinishedPulling="2026-03-20 00:40:01.587324295 +0000 UTC m=+2015.813861842" observedRunningTime="2026-03-20 00:40:02.140445929 +0000 UTC m=+2016.366983486" watchObservedRunningTime="2026-03-20 00:40:02.150721965 +0000 UTC m=+2016.377259522"
Mar 20 00:40:03 crc kubenswrapper[4867]: I0320 00:40:03.132392 4867 generic.go:334] "Generic (PLEG): container finished" podID="66f75f5d-b2ee-4f50-9745-46bd1ee8cca2" containerID="273f4ba703eca740accd2d868b734ee038ccc71fc67b3c4c0a17c1c821e341cd" exitCode=0
Mar 20 00:40:03 crc kubenswrapper[4867]: I0320 00:40:03.132561 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566120-hcrpj" event={"ID":"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2","Type":"ContainerDied","Data":"273f4ba703eca740accd2d868b734ee038ccc71fc67b3c4c0a17c1c821e341cd"}
Mar 20 00:40:04 crc kubenswrapper[4867]: I0320 00:40:04.418017 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:04 crc kubenswrapper[4867]: I0320 00:40:04.598864 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdmrd\" (UniqueName: \"kubernetes.io/projected/66f75f5d-b2ee-4f50-9745-46bd1ee8cca2-kube-api-access-sdmrd\") pod \"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2\" (UID: \"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2\") "
Mar 20 00:40:04 crc kubenswrapper[4867]: I0320 00:40:04.606823 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f75f5d-b2ee-4f50-9745-46bd1ee8cca2-kube-api-access-sdmrd" (OuterVolumeSpecName: "kube-api-access-sdmrd") pod "66f75f5d-b2ee-4f50-9745-46bd1ee8cca2" (UID: "66f75f5d-b2ee-4f50-9745-46bd1ee8cca2"). InnerVolumeSpecName "kube-api-access-sdmrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:40:04 crc kubenswrapper[4867]: I0320 00:40:04.700761 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdmrd\" (UniqueName: \"kubernetes.io/projected/66f75f5d-b2ee-4f50-9745-46bd1ee8cca2-kube-api-access-sdmrd\") on node \"crc\" DevicePath \"\""
Mar 20 00:40:05 crc kubenswrapper[4867]: I0320 00:40:05.153975 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566120-hcrpj" event={"ID":"66f75f5d-b2ee-4f50-9745-46bd1ee8cca2","Type":"ContainerDied","Data":"145d10d09a2002ad615092a9c78c4cec3fd457df971f98b241f322feb1e6e7ad"}
Mar 20 00:40:05 crc kubenswrapper[4867]: I0320 00:40:05.154070 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="145d10d09a2002ad615092a9c78c4cec3fd457df971f98b241f322feb1e6e7ad"
Mar 20 00:40:05 crc kubenswrapper[4867]: I0320 00:40:05.154098 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566120-hcrpj"
Mar 20 00:40:05 crc kubenswrapper[4867]: I0320 00:40:05.236909 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566114-vh922"]
Mar 20 00:40:05 crc kubenswrapper[4867]: I0320 00:40:05.246409 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566114-vh922"]
Mar 20 00:40:06 crc kubenswrapper[4867]: I0320 00:40:06.432066 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f6250a-0ede-42ce-9b60-501731111078" path="/var/lib/kubelet/pods/26f6250a-0ede-42ce-9b60-501731111078/volumes"
Mar 20 00:40:07 crc kubenswrapper[4867]: I0320 00:40:07.421939 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"
Mar 20 00:40:07 crc kubenswrapper[4867]: E0320 00:40:07.422627 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28"
Mar 20 00:40:19 crc kubenswrapper[4867]: I0320 00:40:19.421917 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"
Mar 20 00:40:19 crc kubenswrapper[4867]: E0320 00:40:19.423770 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28"
Mar 20 00:40:30 crc kubenswrapper[4867]: I0320 00:40:30.421987 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"
Mar 20 00:40:30 crc kubenswrapper[4867]: E0320 00:40:30.422974 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.764515 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m87qd"]
Mar 20 00:40:31 crc kubenswrapper[4867]: E0320 00:40:31.764887 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f75f5d-b2ee-4f50-9745-46bd1ee8cca2" containerName="oc"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.764906 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f75f5d-b2ee-4f50-9745-46bd1ee8cca2" containerName="oc"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.765103 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f75f5d-b2ee-4f50-9745-46bd1ee8cca2" containerName="oc"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.766547 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.777408 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m87qd"]
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.940621 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-utilities\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.940767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-catalog-content\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:31 crc kubenswrapper[4867]: I0320 00:40:31.941048 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcmjs\" (UniqueName: \"kubernetes.io/projected/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-kube-api-access-kcmjs\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.041993 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcmjs\" (UniqueName: \"kubernetes.io/projected/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-kube-api-access-kcmjs\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.042080 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-utilities\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.042101 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-catalog-content\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.042701 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-utilities\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.042892 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-catalog-content\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.066297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcmjs\" (UniqueName: \"kubernetes.io/projected/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-kube-api-access-kcmjs\") pod \"certified-operators-m87qd\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") " pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.095128 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.334991 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m87qd"]
Mar 20 00:40:32 crc kubenswrapper[4867]: I0320 00:40:32.383056 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m87qd" event={"ID":"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f","Type":"ContainerStarted","Data":"81e5eed84beb0d0b775cd7e21df08414aedd1329510f64bb080fde6ae99300f2"}
Mar 20 00:40:33 crc kubenswrapper[4867]: I0320 00:40:33.393361 4867 generic.go:334] "Generic (PLEG): container finished" podID="c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" containerID="01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1" exitCode=0
Mar 20 00:40:33 crc kubenswrapper[4867]: I0320 00:40:33.393441 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m87qd" event={"ID":"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f","Type":"ContainerDied","Data":"01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1"}
Mar 20 00:40:34 crc kubenswrapper[4867]: I0320 00:40:34.405593 4867 generic.go:334] "Generic (PLEG): container finished" podID="c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" containerID="13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f" exitCode=0
Mar 20 00:40:34 crc kubenswrapper[4867]: I0320 00:40:34.405667 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m87qd" event={"ID":"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f","Type":"ContainerDied","Data":"13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f"}
Mar 20 00:40:35 crc kubenswrapper[4867]: I0320 00:40:35.417185 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m87qd" event={"ID":"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f","Type":"ContainerStarted","Data":"922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71"}
Mar 20 00:40:35 crc kubenswrapper[4867]: I0320 00:40:35.443664 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m87qd" podStartSLOduration=2.788737189 podStartE2EDuration="4.443633508s" podCreationTimestamp="2026-03-20 00:40:31 +0000 UTC" firstStartedPulling="2026-03-20 00:40:33.395255751 +0000 UTC m=+2047.621793268" lastFinishedPulling="2026-03-20 00:40:35.05015204 +0000 UTC m=+2049.276689587" observedRunningTime="2026-03-20 00:40:35.43752102 +0000 UTC m=+2049.664058557" watchObservedRunningTime="2026-03-20 00:40:35.443633508 +0000 UTC m=+2049.670171115"
Mar 20 00:40:42 crc kubenswrapper[4867]: I0320 00:40:42.096178 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:42 crc kubenswrapper[4867]: I0320 00:40:42.096551 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:42 crc kubenswrapper[4867]: I0320 00:40:42.173136 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:42 crc kubenswrapper[4867]: I0320 00:40:42.563279 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:42 crc kubenswrapper[4867]: I0320 00:40:42.624748 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m87qd"]
Mar 20 00:40:43 crc kubenswrapper[4867]: I0320 00:40:43.422373 4867 scope.go:117] "RemoveContainer" containerID="8e644c8994c244b69610e9218795c340d974fc67c48021b27d2f81872883b4b5"
Mar 20 00:40:43 crc kubenswrapper[4867]: E0320 00:40:43.423314 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v9vbm_openshift-machine-config-operator(00eacbd3-d921-414b-8b8d-c4298bdd5a28)\"" pod="openshift-machine-config-operator/machine-config-daemon-v9vbm" podUID="00eacbd3-d921-414b-8b8d-c4298bdd5a28"
Mar 20 00:40:44 crc kubenswrapper[4867]: I0320 00:40:44.506615 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m87qd" podUID="c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" containerName="registry-server" containerID="cri-o://922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71" gracePeriod=2
Mar 20 00:40:44 crc kubenswrapper[4867]: I0320 00:40:44.967731 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.159955 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-catalog-content\") pod \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") "
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.160824 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-utilities\") pod \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") "
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.160948 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcmjs\" (UniqueName: \"kubernetes.io/projected/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-kube-api-access-kcmjs\") pod \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\" (UID: \"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f\") "
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.161770 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-utilities" (OuterVolumeSpecName: "utilities") pod "c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" (UID: "c97f3871-ec39-4c25-ac57-c18f4e1f7e1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.167094 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-kube-api-access-kcmjs" (OuterVolumeSpecName: "kube-api-access-kcmjs") pod "c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" (UID: "c97f3871-ec39-4c25-ac57-c18f4e1f7e1f"). InnerVolumeSpecName "kube-api-access-kcmjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.263448 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.263530 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcmjs\" (UniqueName: \"kubernetes.io/projected/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-kube-api-access-kcmjs\") on node \"crc\" DevicePath \"\""
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.517430 4867 generic.go:334] "Generic (PLEG): container finished" podID="c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" containerID="922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71" exitCode=0
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.517518 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m87qd" event={"ID":"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f","Type":"ContainerDied","Data":"922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71"}
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.517571 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m87qd" event={"ID":"c97f3871-ec39-4c25-ac57-c18f4e1f7e1f","Type":"ContainerDied","Data":"81e5eed84beb0d0b775cd7e21df08414aedd1329510f64bb080fde6ae99300f2"}
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.517600 4867 scope.go:117] "RemoveContainer" containerID="922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.517650 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m87qd"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.539202 4867 scope.go:117] "RemoveContainer" containerID="13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.567131 4867 scope.go:117] "RemoveContainer" containerID="01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.585433 4867 scope.go:117] "RemoveContainer" containerID="922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71"
Mar 20 00:40:45 crc kubenswrapper[4867]: E0320 00:40:45.586253 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71\": container with ID starting with 922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71 not found: ID does not exist" containerID="922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.586322 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71"} err="failed to get container status \"922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71\": rpc error: code = NotFound desc = could not find container \"922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71\": container with ID starting with 922716aabe294c35f522b69244c1f47e26e486abbea6b9315fb7fe327d32ac71 not found: ID does not exist"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.586363 4867 scope.go:117] "RemoveContainer" containerID="13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f"
Mar 20 00:40:45 crc kubenswrapper[4867]: E0320 00:40:45.586924 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f\": container with ID starting with 13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f not found: ID does not exist" containerID="13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.586984 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f"} err="failed to get container status \"13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f\": rpc error: code = NotFound desc = could not find container \"13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f\": container with ID starting with 13dee635707f198d5cb455ea88db3549baf9858b6cce7b704935a4e2a12ad03f not found: ID does not exist"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.587010 4867 scope.go:117] "RemoveContainer" containerID="01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1"
Mar 20 00:40:45 crc kubenswrapper[4867]: E0320 00:40:45.587557 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1\": container with ID starting with 01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1 not found: ID does not exist" containerID="01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.587589 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1"} err="failed to get container status \"01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1\": rpc error: code = NotFound desc = could not find container \"01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1\": container with ID starting with 01f268eefc30667a4bdbedbbf70f3002be12212e80c12942332f268e0827a3a1 not found: ID does not exist"
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.646434 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" (UID: "c97f3871-ec39-4c25-ac57-c18f4e1f7e1f"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.670104 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.853399 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m87qd"]
Mar 20 00:40:45 crc kubenswrapper[4867]: I0320 00:40:45.867385 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m87qd"]
Mar 20 00:40:46 crc kubenswrapper[4867]: I0320 00:40:46.432218 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97f3871-ec39-4c25-ac57-c18f4e1f7e1f" path="/var/lib/kubelet/pods/c97f3871-ec39-4c25-ac57-c18f4e1f7e1f/volumes"